core: ofnode: Have ofnode_read_u32_default return a u32
It was returning an int, which does not work if the u32 value being read,
or the default value, overflows a signed int.
While it could be made to work with careful casting, when using a C
standard/compiler where casting negative signed values to unsigned has
defined behavior, it seems clear that ofnode_read_s32_default() is the
intended interface for signed values.
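
For illustration, a hypothetical caller affected by the old int return
type (the node and property name here are made up):

    u32 base;

    /* A property or default value above INT_MAX, e.g. 0xffffffff,
     * does not fit in the old int return value, so what the caller
     * saw depended on implementation-defined signed/unsigned
     * conversions rather than the actual u32 value.
     */
    base = ofnode_read_u32_default(node, "base-address", 0xffffffffU);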
Cc: Simon Glass <sjg@chromium.org>
Signed-off-by: Trent Piepho <tpiepho@impinj.com>