Does anyone have pointers on where to start with actual embedded programming? I have a couple of Arduinos and RPis lying around, but I'm wondering if there are more 'real' ways to do it.
I have had a lot of fun following Ben Eater's[1] projects, which aren't always embedded-specific (sometimes they're TTL, sometimes Arduino) but are excellent for understanding concepts deeply.
I tend to learn best with a specific project that can grow or morph as my interest or experience dictates. You might find something to build with an Arduino, using the toolchain/IDE/libraries, get it working and then start stripping out libraries for your own implementations, or getting a toolchain of your own to cross-compile and flash.
[0]: http://www.edx.org/course/embedded-systems-shape-the-world-m...
[1]: https://eater.net/
How could one get, in C, the safety guarantees that Rust provides?
Thanks!
I actually have built the clock module from Ben Eater with the intent of building the 6502 computer project at some point in the future. I really like his stuff.
* Michael Pont's Embedded C and Patterns for Time-Triggered Embedded Systems (PTTES). They are chock full of invaluable C code (for 8051); in particular, beg, borrow or steal PTTES (free pdf available). Also checkout his other books and his company SafeTTy Systems.
* Make: AVR Programming by Elliot Williams teaches you to directly program the ATmega328P on an Arduino Uno.
* Introduction to Embedded Systems: Using Microcontrollers and the MSP430 by Jimenez, Palomera et al. is an excellent textbook explaining each hardware aspect of an embedded system and how to program them.
Note: All the above are for bare-metal embedded programming. For Embedded Linux on RPi, I suggest Exploring Raspberry Pi: Interfacing to the Real World with Embedded Linux by Derek Molloy.
For this part, it's also fun to have a logic analyzer (they start at about 10 bucks) to see a change in code manifest on physical pins. It's also helpful to confirm that what you think you are doing is actually happening. E.g., the SPI chip-select pin may be inverted (high when it should be low and vice versa).
Then start off with a simple program that does init and periodically reads the sensor. Perhaps add thresholds that trigger, e.g., an LED. Then you can extend this to pipe data over the serial port to the RPi and push it to some server of your choice (e.g. MQTT), or display it on a local webserver dashboard.
Go with sensors that speak ordinary SPI or I2C, not some one-wire protocol. Suggestions: BMP180 (temperature, pressure), TSL2561 (light).
Have fun!
edit: if you are doing it on the Arduino, you can start off with the Arduino SPI/I2C libs, and later on, if you wish, fire up the AVR datasheet (or whatever CPU is on your Arduino) and implement I2C/SPI yourself by changing registers etc. on the CPU.
I'd recommend getting a dev kit like the STM32F4DISCOVERY (https://www.st.com/en/evaluation-tools/stm32f4discovery.html). ST Micro's boards are often used for courses (https://www.udemy.com/course/cortex-m/) so you may like to take some of those courses. You'll often hear about the TI MSP430 as another microcontroller but AFAIK it's beginning to be a bit dated. Although come to think of it, there's probably more educational material out there for it, if you're willing to search.
Grab a kit like the Sparkfun Beginner's Kit (https://www.sparkfun.com/products/13973) and read some of the tutorials on their website about creating circuits. Tutorials or courses for your dev kit should get you to a point where you can light an LED controlled by the micro.
From there, you may like to do more advanced stuff like communicating with sensors over specific protocols (Sparkfun's Tinker Kit and associated guides may be of use https://www.sparkfun.com/products/14556 though you will have to translate from Arduino to C code, which can be good practice for knowing how Arduino works under-the-hood).
At this point, you'll probably know whether you want to keep learning more about sensors/lights/IoT type stuff, or want to branch out to other embedded-related topics. More advanced IoT material covers things like taking sensor measurements, storing measurements to memory, interfacing with displays, and sending data via WiFi or Bluetooth.
Edit: I skimmed over a lot to keep it short. There's a lot hiding behind how casually these recommendations are made, so feel free to reach out with any questions (email in profile).
I'm a Python/Julia developer starting to learn C. I have K&R already, and Test Driven Development for Embedded C (Grenning).
I did order 'Modern C' by Gustedt but the publisher never delivered to Waterstones so they had to cancel the order (about 6 months ago, book still unavailable from Waterstones as of today).
Contents are:
* How to identify and handle undefined behavior in a C program
* The range and representations of integers and floating-point values
* How integer promotions are performed and how they may affect portability
I think it's incredibly important to understand how numbers on computers work, what the limits of 32-bit and 64-bit values are, and how doubles/floats play into it.
* The order of evaluation of expressions
Most coding styles avoid ambiguity by just using (enough parentheses) around (important statements).
* Dynamic memory allocation including the use of non-standard functions
Non-standard worries me here. Memory allocation isn't particularly in need of non-standard functions; malloc/free and friends cover most cases.
* The philosophy underlying the use of character types in C
* Character encodings and types
I'm curious as to what this is covering? Is it system specific?
* How to perform input/output (I/O) with terminals and file systems using C Standard Streams and POSIX file descriptors
* The translation phases implemented by the C compiler and the role of the preprocessor
* How to test, debug, and analyze C programs
I/O is an interesting topic, but I suspect it's best covered by a systems programming manual. There are several Unix books, Stevens being the go-to guide historically, and I would go straight to the source instead.
>"You'll learn how to develop correct, portable, professional-quality code and build a foundation for developing security-critical and safety-critical systems."
So the stated aim of the book is to build a "foundation" from which you could then go on to digest and effectively use advanced things like the Stevens book.
I learned from K&R, but highly recommend Seacord's books if you're looking for how to write secure C and a more modern take on some of the trickier parts of C.
K&R is a good reference, but not good for learning the language IMO.
1: I love Plum's books (starting with learning to program in C), but can't recommend them since the language has changed so much in the 37 years since it came out.
Looking at the current language/job market outside the big hubs, I feel like we are hitting the same problems as in open source. People add C++ to every C job posting to have something with the same level of innovation as the newer languages, even when the job is embedded Linux and you wouldn't let a C++ construct anywhere near the system.
I guess all these AVR, STM32 and ESP8266 devices running C++ are to be discarded too?
But after initial turbulence, life has gotten much simpler (and dare I say quiet) for our HR since we moved to using the stock job profiles shipped with the platforms we buy.
We'll also be pushing out a promotional offer on this book, likely this coming week so you may want to wait for that. Just trying to keep things on track now that our company is completely remote and physical book stores are closing left and right.
I ask because there's 3 or 4 titles on your "coming soon" page that I'm very interested in but I have limited funds for a purchase. Cheers.
Okay, I realize "profusion" might seem a bit overblown, but honestly, in C world (and in comparison to other languages), this is practically a publishing boom.
Not that I'm complaining; C was my first programming language (back some time in the mid-90s), and it's still my favorite. But I wonder why we're suddenly getting new books on it? The language itself hasn't undergone any substantial changes recently, and if anything, "memory safety" is all the rage -- a thing that C most assuredly is not.
If this can get the non-C programmers to learn and understand the usefulness of a simple, minimal and direct language ("modern" languages are just too bloated), it is well worth it.
I have said it before and say it again; C will allow you to program everything from itty-bitty MCUs to honking server machines. It is also the de-facto universal "glue" language and its real-world benefits far outweigh any perceived difficulties.
Bottom line: every programmer should know C, whether you use/like it or not.
It's got interactivity (bundled compiler/interpreter in one system), a large number of implementations (possibly more supported CPUs than C++, but that might be more due to its age), and a Forth system is easier to "pick apart" and learn how its compiler's implemented than a C compiler.
E: Also, C can be run anywhere. ANYWHERE. It can run on home computers and consoles from the 80's! I like that fact.
21st Century C is from 2014(?) and Understanding and Using C Pointers is from 2013, so it is not exactly a sudden profusion.
Had that happened a while ago, we would never have seen PHP et al.
There is absolutely no reason why a website should be built with C, sorry...
Edit: link to rasmus lerdorf on 25 years php: https://m.youtube.com/watch?v=wCZ5TJCBWMg
Rather:
ZetZ -- Symbolic Verifier and Transpiler to C. https://github.com/aep/zz and previous discussions, https://news.ycombinator.com/item?id=22245409
Zig, https://ziglang.org/ https://hn.algolia.com/?q=ziglang
Rust is more of a C++ competitor, https://www.rust-lang.org/
And Dlang has a regime where it can be used w/o a GC. https://dlang.org/spec/garbage.html
I left out any language which forces the use of a garbage collector.
Tried twice, couldn't solve it.
If you're trying to sell something, make it easy to get my money.
(And it doesn't matter if it's a physical book or e-book, the cost of printing nowadays is ca. $1 for each 100 pages.)
When our authors ask how long their book should be I always say: Long enough to cover the subject, short enough to keep it interesting. My company is called No Starch Press for a reason. Think of the word starch as a nicer way to say "BS", as in No BS Press.
There's a lot of work behind these pages, as with all of our books. Unlike any other publisher in this field, we have several people who read and craft every line of every book as necessary, together with each author, before a book goes off to a copy editor. That's where most books start, but not ours.
The real cost in creating a good book is not in the paper. It's in the time it takes to actually craft the words.
This is what has annoyed me about pulp paperback fiction. Prices skyrocketed from $5 to $15 in 15 years. Then you can buy the eBook for $12 and feel like you're getting a "deal" on a product with near-zero production costs.
Amazon may ultimately discount this from the list price as they usually do but they are currently deprioritizing books due to the pandemic.
When I see that Brian W. Kernighan releases a book, it's an instant buy for me.
So until a preview chapter or similar is available, I'll throw out this link to Modern C, 2nd Ed. by Gustedt. It has been well received and I thought Jens's writing style was solid.
I don’t know if it is the perfect second book to read on C, but it seems well paced and things are well explained.
No Starch never let me down so far, and paying USD 60 (or 40, or whatever) is such a marginal difference for a book you're going to spend dozens of hours on.
maybe older books are cheaper
It's a luxury item. For comparison K&R2 sells for $51.99 new on amazon.
I think a fair price would be $49.99.