Re: [PATCH] arm64: kaslr: Reserve size of ARM64_MEMSTART_ALIGN in linear region

From: Yueyi Li
Date: Mon Dec 24 2018 - 21:30:52 EST


Hi Ard,


On 2018/12/24 17:45, Ard Biesheuvel wrote:
> Does the following change fix your issue as well?
>
> index 9b432d9fcada..9dcf0ff75a11 100644
> --- a/arch/arm64/mm/init.c
> +++ b/arch/arm64/mm/init.c
> @@ -447,7 +447,7 @@ void __init arm64_memblock_init(void)
> * memory spans, randomize the linear region as well.
> */
> if (memstart_offset_seed > 0 && range >= ARM64_MEMSTART_ALIGN) {
> - range = range / ARM64_MEMSTART_ALIGN + 1;
> + range /= ARM64_MEMSTART_ALIGN;
> memstart_addr -= ARM64_MEMSTART_ALIGN *
> ((range * memstart_offset_seed) >> 16);
> }

Yes, that fixes it as well. I just think that modifying the initial
*range* calculation would be easier to grasp. What do you think?



Thanks,
Yueyi