Re: [PATCH bpf-next 06/13] uprobes/x86: Add uprobe syscall to speed up uprobe
From: Thomas Weißschuh
Date: Sat Dec 14 2024 - 08:22:01 EST
On 2024-12-13 22:52:15+0100, Jiri Olsa wrote:
> On Fri, Dec 13, 2024 at 04:12:46PM +0100, Thomas Weißschuh wrote:
>
> SNIP
>
> > > > > +static int __init arch_uprobes_init(void)
> > > > > +{
> > > > > +	unsigned long size = uprobe_trampoline_end - uprobe_trampoline_entry;
> > > > > +	static struct page *pages[2];
> > > > > +	struct page *page;
> > > > > +
> > > > > +	page = alloc_page(GFP_HIGHUSER);
> > > >
> > > > That page could be in static memory, removing the need for the explicit
> > > > allocation. It could also be __ro_after_init.
> > > > Then tramp_mapping itself can be const.
> > >
> > > hum, how would that look? I think that to get a proper page object
> > > you have to call alloc_page or some other page-alloc-family function..
> > > what am I missing?
> >
> > static u8 trampoline_page[PAGE_SIZE] __ro_after_init __aligned(PAGE_SIZE);
> > static struct page *tramp_mapping_pages[2] __ro_after_init;
> >
> > static const struct vm_special_mapping tramp_mapping = {
> > 	.name = "[uprobes-trampoline]",
> > 	.pages = tramp_mapping_pages,
> > 	.mremap = tramp_mremap,
> > };
> >
> > static int __init arch_uprobes_init(void)
> > {
> > 	...
> > 	tramp_mapping_pages[0] = virt_to_page(trampoline_page);
> > 	...
> > }
> >
> > Untested, but it's similar to what the vDSO implementations (which I'm
> > working on at the moment) are doing.
>
> nice idea, better than allocating the page, will do that
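
Roughly, the copy-based variant could then look something like this
(untested sketch; the declarations are the ones from the snippet above,
and the size check and memcpy are my assumptions about how the copy
would be done):

static int __init arch_uprobes_init(void)
{
	unsigned long size = uprobe_trampoline_end - uprobe_trampoline_entry;

	/* Sketch: the trampoline must fit into the statically reserved page. */
	if (WARN_ON_ONCE(size > PAGE_SIZE))
		return -EINVAL;

	/* Copy the trampoline text into the static page and publish it. */
	memcpy(trampoline_page, uprobe_trampoline_entry, size);
	tramp_mapping_pages[0] = virt_to_page(trampoline_page);
	return 0;
}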
Or better yet, just allocate the whole page directly in the inline asm
and avoid the copying, too:
diff --git a/arch/x86/kernel/uprobes.c b/arch/x86/kernel/uprobes.c
index b2420eeee23a..c5e6ca7f998a 100644
--- a/arch/x86/kernel/uprobes.c
+++ b/arch/x86/kernel/uprobes.c
@@ -462,7 +462,7 @@ SYSCALL_DEFINE0(uprobe)
 asm (
 	".pushsection .rodata\n"
-	".global uprobe_trampoline_entry\n"
+	".balign " __stringify(PAGE_SIZE) "\n"
 	"uprobe_trampoline_entry:\n"
 	"endbr64\n"
 	"push %rcx\n"
@@ -474,13 +474,11 @@ asm (
 	"pop %r11\n"
 	"pop %rcx\n"
 	"ret\n"
-	".global uprobe_trampoline_end\n"
-	"uprobe_trampoline_end:\n"
+	".balign " __stringify(PAGE_SIZE) "\n"
 	".popsection\n"
 );
-extern __visible u8 uprobe_trampoline_entry[];
-extern __visible u8 uprobe_trampoline_end[];
+extern u8 uprobe_trampoline_entry[];
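
With the trampoline padded out to its own page in .rodata, the init path
could then presumably shrink to something like this (untested sketch;
tramp_mapping_pages is the array from the snippet above):

static int __init arch_uprobes_init(void)
{
	/* The trampoline already occupies a whole page; just wrap it. */
	tramp_mapping_pages[0] = virt_to_page(uprobe_trampoline_entry);
	return 0;
}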
If you want to keep the copying for some reason, the asm code should go
into the ".init.rodata" section, as it's not used afterwards.
(A bit bikesheddy, I admit)
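
In that case only the section directive would change, roughly:

	".pushsection .init.rodata, \"a\"\n"	/* template is only read during init */

so the trampoline template gets discarded together with the rest of the
init data.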
Thomas