Hi,
Preface:
I'm a uni student working on a Rust kernel for the RPi Zero (the ARMv6 32-bit one). Rust doesn't have a built-in target for ARMv6 bare metal (nightly 1.61.0), so I threw together a custom target based on the C code I had written a few quarters ago for an embedded OS class. I will admit I am not very well versed in linker scripts, and am still semi-new to Rust (I've pushed code upstream to projects like Tock for UART implementations before, but this is my first bare-bones project of my own). There is NO code on this Pi that I did not write or get from an open source like dwelch (thanks!). I should note that I am aware Rust uses UTF-8 for String and &str, and that if I know the contents are all ASCII bytes, I can use &str::as_bytes(). However, this does not work and gives me bad byte data.
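To be clear about what I expect from as_bytes(): on a hosted target it does exactly what I want, handing back the ASCII bytes of the literal with no conversion (it's just a view of the same UTF-8 data). A minimal sanity check (the function name here is illustrative, not from my kernel):

```rust
// Hosted sanity check: as_bytes() on an all-ASCII string literal
// yields the raw ASCII bytes directly; no copy or re-encoding happens.
fn hello_bytes() -> &'static [u8] {
    "hello\n".as_bytes()
}
```

So the bytes themselves are fine on my laptop; the problem only shows up on the Pi.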
What works:
GPIO: I can blink LEDs and set alternate functions for other hardware (the UART needs this).
UART: I can send data back and forth between the Pi and my laptop (M1 Mac). I used the same protocol implementation we wrote in my embedded OS class, since it's dead simple. I'm fairly confident it's correct based on my annotations of the Broadcom docs and errata, and on the fact that my C implementation was cross-checked against 35 other people's, and none of our implementations broke afterwards. This mattered because it was the implementation our Pis used for bootloading programs from our laptops.
What doesn't work:
Sending string bytes over UART. For some reason, if I manually create a [u8; 6] for "hello\n" and send it over the UART to my Unix listener, like so:
let hello: [u8; 6] = [104, 101, 108, 108, 111, 10];
for i in 0..6 {
    nox::boot_put32(hello[i] as u32);
}
then my Unix-side listener correctly receives and parses the bytes as their ASCII characters. However, if I create a print function like so:
pub fn println(msg: &str) {
    let len = msg.len();
    let bytes = msg.as_bytes();
    for i in 0..len {
        unsafe {
            nox::boot_put32(bytes[i] as u32);
        }
    }
}
and call it as println("hello"); in my main, I get bad bytes on my Unix listener. Specifically, I get garbage values (the actual values are in the dump files/ directory), whereas I get the correct values for the first example, shown right above them in the dump file.
No matter how I write this, I end up with the same behavior: I can successfully send the correct first byte, but never anything after it. If I try to send anything beyond the first byte, it all turns to garbage (i.e. not even the first byte is correct anymore). For example, if I rewrite my println() to take a byte slice (&[u8]) and call it like
println(b"p");
I receive the correct byte on my Unix listener. However, if I change the call to this:
println(b"pe");
I then get garbage on my Unix listener (specifically, I should get 0x70 0x65, but I get 0x0 0x30).
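For completeness, the byte-slice variant looks roughly like this. I've sketched it with the UART call abstracted into a closure so the loop itself can be sanity-checked on my laptop; on the Pi the closure would just wrap nox::boot_put32 (the name println_bytes is illustrative, not from my tree):

```rust
// Sketch of the &[u8] variant of println described above. `put`
// stands in for the UART write; on the Pi it would call
// nox::boot_put32, sending one byte per 32-bit word to match the
// boot protocol.
pub fn println_bytes(msg: &[u8], mut put: impl FnMut(u32)) {
    for &b in msg {
        put(b as u32);
    }
}
```

On the Pi side the call would be something like println_bytes(b"pe", |w| unsafe { nox::boot_put32(w) });, and hosted it produces exactly the words I expect, so the loop logic itself seems fine.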
I cannot figure out why, but I SUSPECT it has to do with my linking process and .rodata.
I came to this conclusion because earlier in my testing I was never receiving garbage, only 0x00 from println(). That was odd: why would the bytes of a &str all be zero? So I did some digging online and found mentions on here of &str coming back all nulls, and of it being a problem with .rodata. I used objdump to inspect the sections of my binary, read more about linker scripts, and modified my linker script into its current form; now it appears to place .rodata properly. At least, I can see my string in .rodata if I declare it like so:
let s = "pe";
The actual dump of .rodata is in my rodata.txt file here.
Any ideas as to what the problem could be? I have tried aligning my .rodata to 8 and to 4 bytes, to no avail.
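For context, the alignment attempt looked roughly like this in the linker script. This is a hedged sketch from memory of GNU ld conventions, not a verbatim copy of my actual script:

```
/* Hypothetical fragment, not a verbatim copy of my linker script.
   Rust/LLVM emit per-item sections (e.g. .rodata.str1.1), so the
   wildcard is needed or string literals can end up dropped or
   placed somewhere unexpected. */
.rodata : ALIGN(4)
{
    *(.rodata)
    *(.rodata.*)
}
```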
I have also tried reading from the raw &str pointer with .read_unaligned(), with no success (except, again, on the first character... weird).
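Concretely, that attempt looked something like this. As before, I've abstracted the UART write into a closure so the function can be exercised on a hosted target; on the Pi the closure would wrap nox::boot_put32 (the name put_str_raw is illustrative):

```rust
use core::ptr;

// Sketch of the raw-pointer read attempt: walk the &str's data
// pointer byte by byte with read_unaligned, which copies each byte
// out without assuming any alignment. `sink` stands in for the
// UART write (nox::boot_put32 on the Pi).
fn put_str_raw(msg: &str, mut sink: impl FnMut(u32)) {
    let base = msg.as_ptr();
    for i in 0..msg.len() {
        let b = unsafe { ptr::read_unaligned(base.add(i)) };
        sink(b as u32);
    }
}
```

Hosted, this also produces the right bytes, which is part of why I suspect the problem is in how the image is linked/loaded rather than in the Rust code itself.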
Any help or suggestions would be greatly appreciated; I will be working closely on this for a while.