Re: [PATCH v4 23/27] x86_64: assembly, change all ENTRY+ENDPROC to SYM_FUNC_*

From: Rafael J. Wysocki
Date: Mon Oct 02 2017 - 08:40:08 EST


On Monday, October 2, 2017 11:12:42 AM CEST Jiri Slaby wrote:
> These are all functions which are invoked from elsewhere, so we annotate
> them as global using the new SYM_FUNC_START, and replace their ENDPROCs
> with SYM_FUNC_END.
>
> Also make sure ENTRY/ENDPROC are no longer defined on X86_64, since
> these were their last users.
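>
> As a sketch (the function name is just an example), each conversion
> looks like this:
>
>	-ENTRY(example_func)
>	+SYM_FUNC_START(example_func)
>		...
>		ret
>	-ENDPROC(example_func)
>	+SYM_FUNC_END(example_func)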
>
> Signed-off-by: Jiri Slaby <jslaby@xxxxxxx>
> Cc: "H. Peter Anvin" <hpa@xxxxxxxxx>
> Cc: Thomas Gleixner <tglx@xxxxxxxxxxxxx>
> Cc: Ingo Molnar <mingo@xxxxxxxxxx>
> Cc: x86@xxxxxxxxxx
> Cc: Herbert Xu <herbert@xxxxxxxxxxxxxxxxxxx>
> Cc: "David S. Miller" <davem@xxxxxxxxxxxxx>
> Cc: "Rafael J. Wysocki" <rjw@xxxxxxxxxxxxx>
> Cc: Len Brown <len.brown@xxxxxxxxx>
> Cc: Pavel Machek <pavel@xxxxxx>
> Cc: Matt Fleming <matt@xxxxxxxxxxxxxxxxxxx>
> Cc: Ard Biesheuvel <ard.biesheuvel@xxxxxxxxxx>
> Cc: Boris Ostrovsky <boris.ostrovsky@xxxxxxxxxx>
> Cc: Juergen Gross <jgross@xxxxxxxx>
> Cc: linux-crypto@xxxxxxxxxxxxxxx
> Cc: linux-pm@xxxxxxxxxxxxxxx
> Cc: linux-efi@xxxxxxxxxxxxxxx
> Cc: xen-devel@xxxxxxxxxxxxxxxxxxxx
> ---
> arch/x86/boot/compressed/efi_thunk_64.S | 4 +-
> arch/x86/boot/compressed/head_64.S | 16 ++++----
> arch/x86/crypto/aes-i586-asm_32.S | 8 ++--
> arch/x86/crypto/aes-x86_64-asm_64.S | 4 +-
> arch/x86/crypto/aes_ctrby8_avx-x86_64.S | 12 +++---
> arch/x86/crypto/aesni-intel_asm.S | 44 +++++++++++-----------
> arch/x86/crypto/aesni-intel_avx-x86_64.S | 24 ++++++------
> arch/x86/crypto/blowfish-x86_64-asm_64.S | 16 ++++----
> arch/x86/crypto/camellia-aesni-avx-asm_64.S | 24 ++++++------
> arch/x86/crypto/camellia-aesni-avx2-asm_64.S | 24 ++++++------
> arch/x86/crypto/camellia-x86_64-asm_64.S | 16 ++++----
> arch/x86/crypto/cast5-avx-x86_64-asm_64.S | 16 ++++----
> arch/x86/crypto/cast6-avx-x86_64-asm_64.S | 24 ++++++------
> arch/x86/crypto/chacha20-avx2-x86_64.S | 4 +-
> arch/x86/crypto/chacha20-ssse3-x86_64.S | 8 ++--
> arch/x86/crypto/crc32-pclmul_asm.S | 4 +-
> arch/x86/crypto/crc32c-pcl-intel-asm_64.S | 4 +-
> arch/x86/crypto/crct10dif-pcl-asm_64.S | 4 +-
> arch/x86/crypto/des3_ede-asm_64.S | 8 ++--
> arch/x86/crypto/ghash-clmulni-intel_asm.S | 8 ++--
> arch/x86/crypto/poly1305-avx2-x86_64.S | 4 +-
> arch/x86/crypto/poly1305-sse2-x86_64.S | 8 ++--
> arch/x86/crypto/salsa20-x86_64-asm_64.S | 12 +++---
> arch/x86/crypto/serpent-avx-x86_64-asm_64.S | 24 ++++++------
> arch/x86/crypto/serpent-avx2-asm_64.S | 24 ++++++------
> arch/x86/crypto/serpent-sse2-x86_64-asm_64.S | 8 ++--
> arch/x86/crypto/sha1-mb/sha1_mb_mgr_flush_avx2.S | 8 ++--
> arch/x86/crypto/sha1-mb/sha1_mb_mgr_submit_avx2.S | 4 +-
> arch/x86/crypto/sha1-mb/sha1_x8_avx2.S | 4 +-
> arch/x86/crypto/sha1_avx2_x86_64_asm.S | 4 +-
> arch/x86/crypto/sha1_ni_asm.S | 4 +-
> arch/x86/crypto/sha1_ssse3_asm.S | 4 +-
> arch/x86/crypto/sha256-avx-asm.S | 4 +-
> arch/x86/crypto/sha256-avx2-asm.S | 4 +-
> .../crypto/sha256-mb/sha256_mb_mgr_flush_avx2.S | 8 ++--
> .../crypto/sha256-mb/sha256_mb_mgr_submit_avx2.S | 4 +-
> arch/x86/crypto/sha256-mb/sha256_x8_avx2.S | 4 +-
> arch/x86/crypto/sha256-ssse3-asm.S | 4 +-
> arch/x86/crypto/sha256_ni_asm.S | 4 +-
> arch/x86/crypto/sha512-avx-asm.S | 4 +-
> arch/x86/crypto/sha512-avx2-asm.S | 4 +-
> .../crypto/sha512-mb/sha512_mb_mgr_flush_avx2.S | 8 ++--
> .../crypto/sha512-mb/sha512_mb_mgr_submit_avx2.S | 4 +-
> arch/x86/crypto/sha512-mb/sha512_x4_avx2.S | 4 +-
> arch/x86/crypto/sha512-ssse3-asm.S | 4 +-
> arch/x86/crypto/twofish-avx-x86_64-asm_64.S | 24 ++++++------
> arch/x86/crypto/twofish-x86_64-asm_64-3way.S | 8 ++--
> arch/x86/crypto/twofish-x86_64-asm_64.S | 8 ++--
> arch/x86/entry/entry_64.S | 10 ++---
> arch/x86/entry/entry_64_compat.S | 8 ++--
> arch/x86/kernel/acpi/wakeup_64.S | 8 ++--
> arch/x86/kernel/head_64.S | 12 +++---
> arch/x86/lib/checksum_32.S | 8 ++--
> arch/x86/lib/clear_page_64.S | 12 +++---
> arch/x86/lib/cmpxchg16b_emu.S | 4 +-
> arch/x86/lib/cmpxchg8b_emu.S | 4 +-
> arch/x86/lib/copy_page_64.S | 4 +-
> arch/x86/lib/copy_user_64.S | 16 ++++----
> arch/x86/lib/csum-copy_64.S | 4 +-
> arch/x86/lib/getuser.S | 16 ++++----
> arch/x86/lib/hweight.S | 8 ++--
> arch/x86/lib/iomap_copy_64.S | 4 +-
> arch/x86/lib/memcpy_64.S | 4 +-
> arch/x86/lib/memmove_64.S | 4 +-
> arch/x86/lib/memset_64.S | 4 +-
> arch/x86/lib/msr-reg.S | 8 ++--
> arch/x86/lib/putuser.S | 16 ++++----
> arch/x86/lib/rwsem.S | 20 +++++-----
> arch/x86/mm/mem_encrypt_boot.S | 8 ++--
> arch/x86/platform/efi/efi_stub_64.S | 4 +-
> arch/x86/platform/efi/efi_thunk_64.S | 4 +-
> arch/x86/power/hibernate_asm_64.S | 8 ++--

For the hibernate changes:

Reviewed-by: Rafael J. Wysocki <rafael.j.wysocki@xxxxxxxxx>