Is Developing for ARM More Difficult Than for Other Architectures?


Sure, it's difficult to learn how to develop for ARM, but once you learn ARM architecture, you can apply that knowledge in many other places.


I believe in life-long learning. With this, I continue to learn and discover new things every day. I’m writing tutorials to give something back to the community from which I have learned so much.

On top of this, I receive emails nearly daily asking for help. Many of my articles originate from such requests or questions. I prefer questions or comments in a public forum, because that way everyone can benefit from them. Last week, Alessandro contacted me with this:

Hi Erich,

I hope this finds you well! I'm starting to use ARM processors, but I find them quite complicated on the configuration side. I started in the past with PIC micros (PIC16) in assembly, and I found them quite straightforward to configure (clock, I/O, peripherals, …). Then I moved on to the C language and to the PIC18 without any big issues.

Now I would really like to join the ARM community. I see that these processors are what I've always been looking for in energy, computing power, peripherals, and FINALLY in the IDE (editor, toolchain, and utilities)… AMAZING!!!

The topic is about how to start learning to develop for ARM. Alessandro agreed to make this public, so I thought this might be a good topic for an article.


Long Time Ago…

I started assembly programming on a MOS Technology 6510 and a Motorola 68000, then wrote my first C compiler for the Hitachi H8/500 and Motorola HC11... and many other architectures followed. Like Alessandro, I have been pulled into the amazing and powerful world of more advanced processors as they came out over time, and into the fascinating world of IDEs like Eclipse. I have had the pleasure of working on many microcontrollers and processors, but in retrospect, these early processors and microcontrollers are my favorites. They were rather easy to configure and use compared to modern ARM Cortex microcontrollers. And that's Alessandro's point, too:

Here is my point: an ARM Cortex-M3/M4 microcontroller has a datasheet that is roughly 1,000 pages long.

I tried to understand the "configuration phase" from this and from the example code shipped with the board, but unfortunately there is often a lack of information that keeps me from understanding exactly why the manufacturer wrote that line of code instead of this one…

Many things are taken for granted… Nobody taught me that ;(

So, it is not only me thinking that there is a problem. Maybe all of us who grew up with technology take things for granted. And as the older ones know a lot of the legacy, it is easy to forget that not everything is obvious. It is easy to "know too much" and to forget how much experience and knowledge it takes to understand and use these modern architectures. As for myself, I have realized that my early tutorials were much more basic than the most recent ones. Am I getting too advanced?

As for the 1,000-page data sheet: the ARM Cortex-M information alone spans several thousand pages. ARM microcontroller vendors like STM, TI, and NXP integrate the ARM IP (the core) and add their own special peripherals, so why should they explain how the ARM IP itself works? Go look at the ARM website! Not everyone knows how the NVIC (see my article series) works, but it is essential for using the device. It makes sense that this information is not copied over and over, but for someone starting with the device, it is not obvious and makes for a difficult start. For example, the NXP K64 Sub-Family Reference Manual has 1,789 pages and does not include anything about the ARM Cortex core itself: it covers only the peripherals (UART, SPI, etc.), and all the other important things sit behind a yellow box with some references to the NVIC, and more:

Core Configuration (Source: NXP Kinetis K64F Reference Manual).

Definitely not an easy starting point for a newcomer.


Of course, the number of pages does not say anything about the quality of the documentation. There are interesting comments on that subject in a Hackaday article. One of the comments references the M68HC11 reference manual: about 650 very well-written pages that include everything I need to know about the device. Plus, there was this small programming reference guide with package information, instruction encodings, and all the peripheral settings (ADC, SCI, ports, etc.):

MC68HC11 Programmers Reference Manual.

In contrast to this, below is the ARM "Bible" written by Joseph Yiu: more than 800 (excellent!) pages, but only about the core itself:

Bible by Joseph Yiu.

Should I Learn Cortex?

Yes, the Cortex-M (ARMv7-M) has a different complexity than the HC11, and the ARM Cortex-A would be even more complex! So, maybe the ARM Cortex-M is still the right starting point for learning about microcontrollers?

We are having a similar debate at the university. The university's entry-level microcontroller course uses the Motorola/Freescale-now-NXP-soon-maybe-Qualcomm HCS08 core. It is a rather simple 8/16-bit microcontroller: it's easy to understand, there are no pipelines or caches, and it has straightforward peripherals, making it a perfect device for learning about microcontrollers. While that S08 is produced in high volume, it is a proprietary core and more a thing of the past. Most vendors have since moved on to producing ARM-based devices, so students will very likely end up using an ARM in their workplace or in their projects. Using the HCS08 makes a lot of sense from a learning and didactic point of view, but not from a marketplace perspective.

MC CAR Microcontroller Learning System with HCS08.

I Want to Understand It

Alessandro makes more good points:

“In particular, I would be really, REALLY glad to understand how to configure (using just code, not graphical tools -> I want to understand it!):

* The microcontroller clock for the CPU, peripherals, and other uses… starting from the datasheet pages down to the code.

* The I/O for general purposes and for peripherals… starting from the datasheet pages down to the code.

As soon as I understand it, I will be happy to switch to the plenty of graphical tools available to do that job!

I know that you are very, very busy, but I assume that out there are many people like me who still have the same problem switching to the ARM microcontroller platform, and we all would be very, Very, VERY, VERY GRATEFUL if you could write a great (as you always do) article on this topic.”

I absolutely agree. I have to learn and understand the basics before moving on to high-level tools or something like a graphical pin configuration tool:

Muxing with a Pins Tool.

This is not only true for graphical configuration tools (which are a great help!). It's the same for using a compiler: without an understanding of how a microcontroller works and how it uses assembly instructions to perform a calculation, using a high-level language compiler is like flying in the dark. That does not mean we have to go back and do everything in assembly language, but a good developer should have a good idea of what gets executed as a result of their source code being compiled and run on the target.

The same goes for all the graphical configuration tools like Processor Expert, STM32Cube, and all the configuration and muxing tools. I have to know how things are supposed to work behind the scenes to make good design decisions. The tools are there to help me so that I don't have to memorize the thousands of pages of data sheets and reference manuals. It takes both: the helper tools and a solid reference manual.

However, how the clocking or peripherals like UART, USB, SPI, or I2C work is very vendor-specific. There are common concepts like clock gating, prescalers, buffering, interrupts, latching, latency, and configuration registers, but the implementation details are vendor- and even device-specific. In the extreme case, that means these things have to be re-learned for every new device. On the plus side, what I have learned about the ARM Cortex architecture itself can be re-used.


What about all the examples from the microcontroller vendors? They can be a good starting point, but I find many examples are not well documented, and if there are comments in the sources, they tell me the 'what' and not the 'why'. Or the examples do not do what I believe they should do: how much sense does a 'blinky' example make if it does not even use the LED on the board?

In the past, I have seen good application notes, but then it is challenging to find the source code for them. Or the sources/projects are outdated because the tools they used no longer exist or have been replaced by other, incompatible tools.


I wish there were more books (see the Books section). I like to have a paper version of good information; I like to make notes on the pages or add sticky notes. But maybe I'm alone in this: I see only a few students printing the lab instructions or the script (why?). Publishing books is risky and expensive, although there are new 'on demand' models that sound interesting. Still, there is a fundamental problem: a good book about developing for a processor or board very quickly gets very board-, processor-, and microcontroller-specific. So unless it is for a very popular board (say, the Raspberry Pi), the market is probably too fragmented and not worth the effort.


A great starting point is to use one of the many (and inexpensive) evaluation boards:

In the past, I bought the LPCXpresso from Embedded Artists with the LPC1343 and tried to start with this… But as soon as I searched the web for something about configuration, I quickly realized that the STM32 is more popular, so probably I should have bought that!!! Personally, I would be quite happy to use the NXP.

Here is another challenge: there are so many of these boards that only a few get widely used and supported. The silicon vendors produce boards for nearly every device they make, intended for evaluation but not for development. With too many and too different boards, only a few reach the critical mass needed to become a community board. One exception is the family of Arduino boards, which are mostly compatible with each other. The best examples for me are the Raspberry Pi boards: the Raspberry Pi Foundation has put a lot of effort into keeping things compatible and ensuring that software runs on all boards, something I don't see elsewhere. With this, it is much easier to learn and find things for a Raspberry Pi than, say, for an LPC1343.

FRDM-KL25Z connected to Raspberry Pi with openHAB.

Software and Tools

Another aspect of using an architecture is the availability and quality of software development tools:

Another BIG problem that I had was with the "toolchain".

When I bought the LPCXpresso, it was shipped with the CodeRed compiler. Honestly, for me it was a big mess to understand that there are 3 main compilers: ARM, IAR, and GCC. Now it is a foregone conclusion, but when I started, it wasn't… I didn't want to be stuck with (learning) one compiler which could have been behind a payment option.

The second problem was about a good IDE. I wanted to start with the best and most scalable option, but at that time, and probably even now, I'm not really aware of all the pros and cons that one has vs. another…

So, the good thing in the ARM world is that there are plenty of choices. The bad thing in the ARM world is that there are plenty of changing choices. Several vendors, including ARM Inc., provide multiple (competing) toolchains and IDEs, and with all the mergers and acquisitions, things are constantly changing, too. The only good thing on the IDE side is that many tool vendors have standardized on Eclipse, so learning Eclipse is time well invested: I can reuse that knowledge with the other vendors, at least until a new standard emerges.


Is it more difficult to develop for ARM devices? I think that depends. Personally, I very much like the ARM architecture, but in my opinion, it is easier and simpler to develop, say, for Intel. I don't like the Microsoft+Intel combo, but they seem to do a better job of helping developers with good software and tools. ARM Inc. tries to catch up with mbed/CMSIS and other initiatives, but it is still fragmented and inconsistent. I think the strength of the broad ARM ecosystem is also its biggest weakness, resulting in fragmentation, inconsistency, and constant change.

On the other side, because so many vendors use an ARM architecture in their devices, once I have learned the architecture, the next device is easier. I still have to learn the different peripherals, but at least the core remains (mostly) the same. It is similar with Eclipse as an IDE: most vendors use it, and once I have learned it, that learning time is well invested.

Silicon is getting more and more complex, as are the software and tools. To me, hardware and silicon are important, but even more important are the software and tools.

What counts in the end is how fast I can finish my project/product with unique functionality, and the more I can reduce my learning time, the better. With all the mergers and acquisitions going on, it is hard to say where we will end up in 5-10 years. I feel the success of an architecture depends on how easy it is to get new users like Alessandro up to speed.

I know that the topic discussed here is controversial and open to different opinions. But I would like to thank Alessandro for bringing up his thoughts.


Published at DZone with permission of Erich Styger, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
