Re: [PATCH] crypto: x86/aes-ni: fix AVX detection
From: Maxim Levitsky
Date: Wed Nov 03 2021 - 08:50:11 EST
On Wed, 2021-11-03 at 14:46 +0200, Maxim Levitsky wrote:
> Fix two semi-theoretical issues:
>
> 1. AVX is assumed to be present when AVX2 is present.
> That can be false in a VM. This can be considered a hypervisor bug,
> but the kernel should not crash in this case.
>
> 2. YMM state can be soft-disabled in XCR0.
>
> Fix both issues by using cpu_has_xfeatures(XFEATURE_MASK_YMM)
> to check for usable AVX support.
>
> Fixes: d764593af9249 ("crypto: aesni - AVX and AVX2 version of AESNI-GCM encode and decode")
>
> Signed-off-by: Maxim Levitsky <mlevitsk@xxxxxxxxxx>
I forgot to mention that Paolo Bonzini helped me with this patch,
especially with the way to detect XCR0 bits.
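
For reference, the underlying rule is the usual one: AVX is only usable
when the CPU advertises it in CPUID and the OS has enabled YMM state in
XCR0; cpu_has_xfeatures(XFEATURE_MASK_YMM, NULL) covers the latter from
inside the kernel. Below is a minimal user-space sketch of the same
check (just an illustration, not kernel code; all names in it are mine):

/*
 * Illustration only (user space, not the kernel code path): AVX is
 * usable only if the CPU advertises it in CPUID *and* the OS has
 * enabled YMM state in XCR0.
 */
#include <stdint.h>
#include <stdio.h>
#include <cpuid.h>

static uint64_t read_xcr0(void)
{
	uint32_t lo, hi;

	/* xgetbv with ECX = 0 returns XCR0 in EDX:EAX */
	__asm__ volatile(".byte 0x0f, 0x01, 0xd0"
			 : "=a" (lo), "=d" (hi) : "c" (0));
	return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
	uint32_t eax, ebx, ecx, edx;
	int osxsave, avx, ymm_enabled;

	if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
		return 1;

	osxsave = !!(ecx & (1u << 27));	/* OS uses XSAVE/XRSTOR */
	avx     = !!(ecx & (1u << 28));	/* CPU supports AVX     */

	/* XCR0 bit 1 = SSE state, bit 2 = YMM state */
	ymm_enabled = osxsave && (read_xcr0() & 0x6) == 0x6;

	printf("AVX usable: %s\n", (avx && ymm_enabled) ? "yes" : "no");
	return 0;
}
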
Best regards,
Maxim Levitsky
> ---
> arch/x86/crypto/aesni-intel_glue.c | 25 +++++++++++++------------
> 1 file changed, 13 insertions(+), 12 deletions(-)
>
> diff --git a/arch/x86/crypto/aesni-intel_glue.c b/arch/x86/crypto/aesni-intel_glue.c
> index 0fc961bef299c..20db1e500ef6f 100644
> --- a/arch/x86/crypto/aesni-intel_glue.c
> +++ b/arch/x86/crypto/aesni-intel_glue.c
> @@ -1147,24 +1147,25 @@ static int __init aesni_init(void)
>  	if (!x86_match_cpu(aesni_cpu_id))
>  		return -ENODEV;
>  #ifdef CONFIG_X86_64
> -	if (boot_cpu_has(X86_FEATURE_AVX2)) {
> -		pr_info("AVX2 version of gcm_enc/dec engaged.\n");
> -		static_branch_enable(&gcm_use_avx);
> -		static_branch_enable(&gcm_use_avx2);
> -	} else
> -	if (boot_cpu_has(X86_FEATURE_AVX)) {
> -		pr_info("AVX version of gcm_enc/dec engaged.\n");
> +	if (cpu_has_xfeatures(XFEATURE_MASK_YMM, NULL)) {
> +
>  		static_branch_enable(&gcm_use_avx);
> -	} else {
> -		pr_info("SSE version of gcm_enc/dec engaged.\n");
> -	}
> -	if (boot_cpu_has(X86_FEATURE_AVX)) {
> +
> +		if (boot_cpu_has(X86_FEATURE_AVX2)) {
> +			static_branch_enable(&gcm_use_avx2);
> +			pr_info("AVX2 version of gcm_enc/dec engaged.\n");
> +		} else {
> +			pr_info("AVX version of gcm_enc/dec engaged.\n");
> +		}
> +
>  		/* optimize performance of ctr mode encryption transform */
>  		static_call_update(aesni_ctr_enc_tfm, aesni_ctr_enc_avx_tfm);
>  		pr_info("AES CTR mode by8 optimization enabled\n");
> +
> +	} else {
> +		pr_info("SSE version of gcm_enc/dec engaged.\n");
>  	}
>  #endif
> -
>  	err = crypto_register_alg(&aesni_cipher_alg);
>  	if (err)
>  		return err;