On Tue, 7 Jun 2022, Jiri Slaby wrote:
It is preferred to use sizeof(*pointer) instead of sizeof(type). First,
if the variable's type changes, the former does not need to be updated
(unlike the latter). Second, the latter is error-prone given the mixture
of (u16), (u16 *), and (u16 **) here.
Signed-off-by: Jiri Slaby <jslaby@xxxxxxx>
Reviewed-by: Ilpo Järvinen <ilpo.jarvinen@xxxxxxxxxxxxxxx>
This seems fine, but see the comments below, which are not directly related
to the change itself.
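
To spell out the rationale in the commit message, a toy example (identifiers
are made up, not taken from the patch):

        u32 *buf;       /* element type was later changed from u16 to u32 */

        buf = kmalloc_array(n, sizeof(*buf), GFP_KERNEL);  /* still sized correctly */
        buf = kmalloc_array(n, sizeof(u16), GFP_KERNEL);   /* silently under-allocates */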
---
drivers/tty/vt/consolemap.c | 23 ++++++++++++-----------
1 file changed, 12 insertions(+), 11 deletions(-)
diff --git a/drivers/tty/vt/consolemap.c b/drivers/tty/vt/consolemap.c
index 097ab7d01f8b..79a62dcca046 100644
--- a/drivers/tty/vt/consolemap.c
+++ b/drivers/tty/vt/consolemap.c
@@ -251,12 +251,12 @@ static void set_inverse_trans_unicode(struct vc_data *conp,
return;
q = p->inverse_trans_unicode;
if (!q) {
- q = p->inverse_trans_unicode =
- kmalloc_array(MAX_GLYPH, sizeof(u16), GFP_KERNEL);
+ q = p->inverse_trans_unicode = kmalloc_array(MAX_GLYPH,
+ sizeof(*q), GFP_KERNEL);
if (!q)
return;
}
- memset(q, 0, MAX_GLYPH * sizeof(u16));
+ memset(q, 0, MAX_GLYPH * sizeof(*q));
Convert the kmalloc_array() into kcalloc() and move the memset() into an
else branch?
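
Roughly like this, if so (untested sketch of that suggestion; kcalloc()
zero-fills a fresh allocation, so only the reused buffer needs clearing):

        q = p->inverse_trans_unicode;
        if (!q) {
                q = p->inverse_trans_unicode = kcalloc(MAX_GLYPH, sizeof(*q),
                                GFP_KERNEL);
                if (!q)
                        return;
        } else {
                memset(q, 0, MAX_GLYPH * sizeof(*q));
        }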
@@ -514,11 +514,12 @@ con_insert_unipair(struct uni_pagedict *p, u_short unicode, u_short fontpos)
n = UNI_ROW(unicode);
p2 = p1[n];
if (!p2) {
- p2 = p1[n] = kmalloc_array(UNI_ROW_GLYPHS, sizeof(u16), GFP_KERNEL);
+ p2 = p1[n] = kmalloc_array(UNI_ROW_GLYPHS, sizeof(*p2),
+ GFP_KERNEL);
if (!p2)
return -ENOMEM;
/* No glyphs for the characters (yet) */
- memset(p2, 0xff, UNI_ROW_GLYPHS * sizeof(u16));
+ memset(p2, 0xff, UNI_ROW_GLYPHS * sizeof(*p2));
This could have been kcalloc'ed.
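
An untested sketch of that form; note kcalloc() only zero-fills, so if the
0xff "no glyph" sentinel is kept, the memset() stays and the zeroing is
redundant:

        p2 = p1[n] = kcalloc(UNI_ROW_GLYPHS, sizeof(*p2), GFP_KERNEL);
        if (!p2)
                return -ENOMEM;
        /* No glyphs for the characters (yet) */
        memset(p2, 0xff, UNI_ROW_GLYPHS * sizeof(*p2));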