On 12/11/2023 11:56 PM, David Hildenbrand wrote:
Let's mimic what we did with folio_add_file_rmap_*() so we can similarly
replace page_add_anon_rmap() next.

Make the compiler always special-case on the granularity by using
__always_inline.

Note that the new functions ignore the RMAP_COMPOUND flag, which we will
remove as soon as page_add_anon_rmap() is gone.

Signed-off-by: David Hildenbrand <david@xxxxxxxxxx>

Reviewed-by: Yin Fengwei <fengwei.yin@xxxxxxxxx>

With a small question below.
+	if (flags & RMAP_EXCLUSIVE) {
+		switch (mode) {
+		case RMAP_MODE_PTE:
+			for (i = 0; i < nr_pages; i++)
+				SetPageAnonExclusive(page + i);
+			break;
+		case RMAP_MODE_PMD:
+			SetPageAnonExclusive(page);
+			break;
+		}
+	}
+	for (i = 0; i < nr_pages; i++) {
+		struct page *cur_page = page + i;
+
+		/* While PTE-mapping a THP we have a PMD and a PTE mapping. */
+		VM_WARN_ON_FOLIO((atomic_read(&cur_page->_mapcount) > 0 ||
+				  (folio_test_large(folio) &&
+				   folio_entire_mapcount(folio) > 1)) &&
+				 PageAnonExclusive(cur_page), folio);
+	}

This change will iterate all pages for the PMD case. The original behavior
didn't check all pages. Is this change on purpose? Thanks.