
The 2.6.34 release of the mainline Linux kernel now includes support for Bluewater Systems hardware. Support is included for the Ethernet, frame buffer, NAND flash and I2C peripherals, with support for the SPI bus and I2S audio expected to be merged in the next few releases.

Other code which has been committed to the mainline Linux kernel by Bluewater Systems includes:

  • Driver for the SST25L SPI flash
  • Driver for the DS2782 battery gas-gauge
  • Generic GPIO support for the AT91 ARM processors, which is used by our existing AT91-based modules and the upcoming 9G45 modules.

We have recently been working on a project which involved implementing a Bluetooth stack on an ARM Cortex M3 micro, with only 48kB of memory and 256kB of flash. We chose to use the Light-weight Bluetooth library (lwBT), which is a small, cross platform, Bluetooth library designed for embedded environments.

The lwBT library provides basic functionality for the core Bluetooth protocol layers, including L2CAP and SDP. Additionally, we implemented an OBEX layer with support for pushing files to remote devices, along with an OBEX push profile that allows files to be received from other Bluetooth devices. We also implemented a basic serial port profile (SPP) on top of the RFCOMM layer in lwBT, which provides wireless serial communication. This can be used, for example, with the standard Linux BlueZ RFCOMM serial utilities. The entire lwBT stack fits in under 24kB of flash storage and uses between 5 and 10kB of memory, depending on configuration options.

The lwBT stack provides a lot of the core Bluetooth functionality, but the interface to the library is very raw. Our Bluetooth API provides a set of simple access functions for common tasks such as sending and receiving data over an RFCOMM link and pushing files to other devices via OBEX. Our Bluetooth library was also designed to be as platform independent as possible. Only a thin platform-specific layer needs to be written to move the Bluetooth library to a new device.
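As a rough illustration, the sketch below shows the shape such a wrapper API might take in C. All of the names here (bt_init, bt_rfcomm_connect, bt_rfcomm_send, bt_obex_push and the bt_platform_ops structure) are hypothetical, invented for this example; they are not the actual interface of our library or of lwBT.

```c
/*
 * Illustrative sketch only: these names are hypothetical and do not
 * reflect the actual Bluewater Bluetooth library or lwBT interfaces.
 */
#include <stdint.h>
#include <stddef.h>

/* Opaque handle for an RFCOMM (serial port profile) connection. */
typedef struct bt_rfcomm_link bt_rfcomm_link_t;

/* Platform-specific layer: the only code that must be ported to a new
 * device (access to the Bluetooth module's HCI transport, delays, etc.). */
struct bt_platform_ops {
    int  (*hci_write)(const uint8_t *buf, size_t len);
    int  (*hci_read)(uint8_t *buf, size_t len);
    void (*delay_ms)(uint32_t ms);
};

/* Bring up the stack on top of the platform layer. */
int bt_init(const struct bt_platform_ops *ops);

/* Connect to a remote device's SPP service and exchange data. */
bt_rfcomm_link_t *bt_rfcomm_connect(const uint8_t addr[6]);
int bt_rfcomm_send(bt_rfcomm_link_t *link, const void *data, size_t len);
int bt_rfcomm_recv(bt_rfcomm_link_t *link, void *data, size_t maxlen);

/* Push a file to a remote device over OBEX. */
int bt_obex_push(const uint8_t addr[6], const char *name,
                 const void *data, size_t len);

/* Example use: send a short string over an RFCOMM (SPP) link. */
static int send_hello(const uint8_t addr[6])
{
    bt_rfcomm_link_t *link = bt_rfcomm_connect(addr);
    if (!link)
        return -1;
    return bt_rfcomm_send(link, "hello", 5);
}
```

The key design point is that everything hardware-specific is gathered into one small structure of callbacks, so porting to a new device means implementing only those few functions.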

The following diagram shows the structure of the Bluetooth stack provided by our library:

Windows CE on Snapper DV

A common use of our OMAP3530-based technology is to display digital video content on standard televisions or computer monitors. It is a fairly trivial task to attach a DVI/HDMI transmitter to the RGB output of the OMAP3530, providing a very versatile digital video solution. The trick is detecting which resolutions and frequencies the connected television supports. Luckily, there is a standard for this.

The Display Data Channel (DDC) is a protocol that can be used to control a display/monitor via a two wire physical link. DDC is found on most recent computer monitors and is also part of the HDMI cable specification. The most common version of DDC is actually implemented as a standard I2C bus. A DDC compliant display acts as an I2C slave and allows available resolutions and settings to be requested over the bus.

The Extended Display Identification Data (EDID) is a data structure, specified by the Video Electronics Standards Association (VESA), that describes the capabilities of a display. It is this structure that is read over the DDC. The most commonly used version is v1.3, which is the version used by HDMI compliant devices.

Given that most embedded devices have I2C busses, connecting to the DDC of an HDMI connector is a trivial task. Once you have a working I2C bus, reading the EDID is as simple as reading 128 bytes from I2C address 0x50. The "parse-edid" application, part of the open-source "read-edid" package, can be used to turn the binary EDID blob into something human readable.
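For example, on a Linux-based device the EDID can be read through the standard i2c-dev interface with a few lines of C. This is only a sketch: the I2C bus number below is an assumption and depends on which controller the DDC lines of the connector are wired to on your board.

```c
/* Read the 128-byte base EDID block over the DDC (I2C address 0x50).
 * Assumes Linux with the i2c-dev interface enabled. */
#include <fcntl.h>
#include <stdio.h>
#include <stdint.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

#define EDID_ADDR 0x50
#define EDID_LEN  128

int main(void)
{
    uint8_t offset = 0;                    /* start reading from byte 0 */
    uint8_t edid[EDID_LEN];
    int fd = open("/dev/i2c-1", O_RDWR);   /* bus number is board specific */

    if (fd < 0) {
        perror("open");
        return 1;
    }
    if (ioctl(fd, I2C_SLAVE, EDID_ADDR) < 0) {
        perror("ioctl");
        return 1;
    }
    /* Set the EDID read offset, then read the base block. */
    if (write(fd, &offset, 1) != 1 || read(fd, edid, EDID_LEN) != EDID_LEN) {
        perror("edid read");
        return 1;
    }
    /* Dump as hex; the output can be fed to parse-edid for a readable report. */
    for (int i = 0; i < EDID_LEN; i++)
        printf("%02x%c", edid[i], (i % 16 == 15) ? '\n' : ' ');

    close(fd);
    return 0;
}
```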

Critical to most embedded applications is parsing the supported resolutions. Unfortunately, the information in the standard 128 byte EDID is not always enough to make these decisions; the details are often stored in "extension" blocks that follow the standard 128 byte EDID. Each extension block contains data blocks of type audio, video, vendor specific or speaker allocation, and the video data blocks list the supported resolutions and suitable refresh frequencies.
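The sketch below shows the kind of sanity checking and extension detection involved, assuming the base 128 byte block (and any extension blocks) have already been read into one contiguous buffer as above. The offsets used (the fixed header, version bytes 18-19, the extension count at byte 126, the per-block checksum and the CEA-861 extension tag 0x02) come from the EDID and CEA-861 specifications.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Verify a 128-byte EDID/extension block: all bytes must sum to 0 mod 256. */
static int edid_block_valid(const uint8_t *blk)
{
    uint8_t sum = 0;
    for (int i = 0; i < 128; i++)
        sum += blk[i];
    return sum == 0;
}

/* Basic sanity checks on the base EDID block and its extensions.
 * 'edid' points to the base block followed by any extension blocks. */
void edid_inspect(const uint8_t *edid)
{
    static const uint8_t magic[8] = { 0x00, 0xff, 0xff, 0xff,
                                      0xff, 0xff, 0xff, 0x00 };

    if (memcmp(edid, magic, sizeof(magic)) != 0 || !edid_block_valid(edid)) {
        printf("invalid base EDID block\n");
        return;
    }

    int extensions = edid[126];     /* number of extension blocks that follow */
    printf("EDID v%d.%d, %d extension block(s)\n",
           edid[18], edid[19], extensions);

    for (int i = 1; i <= extensions; i++) {
        const uint8_t *ext = edid + i * 128;
        if (!edid_block_valid(ext))
            continue;
        if (ext[0] == 0x02)         /* CEA-861 extension: detailed timings plus
                                       audio/video/speaker data blocks */
            printf("block %d: CEA-861 extension (rev %d)\n", i, ext[1]);
    }
}
```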

While connecting off-the-shelf displays to embedded devices can be a very quick way to add large display capabilities, the reality is that most televisions and monitors are quite specific about the resolutions and refresh rates they need to operate at. To properly support consumer displays, the DDC channel and EDID parsing must therefore be implemented on the embedded device, and the resolution and pixel clock set accordingly.

mbed Microcontroller

If you haven't already got one, head over here and pick up an mbed board for a small sum of money. With it you can:

  • Play with a real piece of ARM hardware
  • Write code in C/C++ using a web-based IDE
  • Develop software without installing any tools on your computer
  • See what lots of other people have done with this great little device
  • Have a program running in under a minute (yes really!)
  • For US$59 it has to be one of the lowest cost ways into ARM
This unit is very similar to the Snapper UDIP module we created a few years ago, but the big innovation is that you don't need to worry about tools. It connects to your computer over USB and you interact with it via a web site.
Sorry, we don't sell these, but see here for lots of options in various countries.

It is perhaps a little over a year since it became obvious to everyone that Intel and ARM were starting to stamp on each other's toes. For years it was assumed that only x86 could do 'real' computers and only ARM could do battery-powered devices. For some reason it was netbooks, a tiny market with limited volume (or even MIDs, which didn't even exist), where all the fuss began.

As the lines between smartphone and PC blur, "ARM is coming up from the portable space, and Intel is coming down from the PC space," says Joseph Byrne, a senior analyst at the Linley Group who specializes in semiconductors and processor IP. "Looking forward, these guys are going to collide."
Talk of convergence between mobile and computer has been going on for a decade, but despite all that time it is not much closer. A laptop is still a PC, despite the inclusion of a battery, and a Nokia E90 is still a phone, despite the keyboard. Many assumed that if Intel wanted to win in the mobile space then it would; it just didn't see the point of fighting over a market where the CPU costs $10-20. Those of us of a more technical bent, while not doubting Intel's capability and engineering prowess, wondered how long it would take. Some thought that Intel could play catch-up for 5 years or more and still not win the battle.

Thin Client

The problem for Intel, as this article describes, is not just ARM but the 'thin client' model. If I can run a 'thin' email client on my ARM-based device, and it can web browse, show photos and videos, open the occasional PDF, and so on, what more do I need? MS Office? Well, maybe, if I'm a business user, but otherwise Google Apps might do the trick. On the other hand, the problem for the thin client model is that it has been talked about for even longer than convergence. It pre-dates the tech crash, when Sun told anyone who would listen that Windows was doomed and the network is the computer. So people are naturally sceptical.

The definition of thin client has broadened quite a bit, now encompassing anything with a decent GUI, not just a very small desktop PC. Still, the name 'netbook' has hung on, now meaning a small laptop, and I am very happy with my (x86-based) Asus Eee PC 700, thanks very much.

Ken Olsen, founder of DEC (ironically later an ARM licensee), supposedly said in the late 1970s: “There is no reason for any individual to have a computer in their home.” Now it is clear that the question is not whether, but how many, and what in fact is a computer? By processing power it would be obvious to Olsen that the Nintendo DS, Apple iPod Touch, Nokia E71, Archos 700 Internet Tablet and Nintendo Wii are all computers in some sense. By that definition our household has several dozen computers. Every one of these devices has at least one ARM chip. None of them has an x86 processor, let alone an Intel one. All are network connected and can be considered thin clients, albeit very thin in some cases. But each of these devices provides an adequate experience for the user, which is all that really matters.

So, whereas Intel is fighting the thin client model, this model can only benefit ARM, since ARM already rules in the thin client space. Intel hopes that thin clients will either not succeed, or will run out of legs and have to move to faster x86 processors. That might be a forlorn hope. In summary, the thin client model suggests that ARM has a bright future as consumers buy more and more of these thin client devices for particular tasks.

Processing Power

The other side of the argument is processing power. Ten years ago you arguably needed an x86 CPU to do just about anything. [I say arguably because there was a time (around 1996) when you could buy a 200MHz ARM-based computer while the Pentiums stopped at 90MHz. But it was the Pentiums which ran Windows.] Perhaps seven years ago email would have been beyond an ARM chip. Five years ago basic word processing and spreadsheets were problematic. But today the processing power, and the associated heat, size and expense, that comes with a high-end x86 CPU is really only needed for video editing, Flash-enabled web browsing and perhaps photo editing. The list of things you can't do with an ARM CPU is getting smaller all the time, and the performance requirements of Windows and the common applications have not kept up. Those, such as Intel, who argue that you can only get a decent web browser on an x86 platform should take a look at the lowly Nokia N900. Engadget's review says this:
Almost without fail, sites were rendered faithfully (just as you'd expect them to look in Firefox on your desktop) with fully-functional, usable Flash embeds -- and it's fast. Not only is the initial rendering fast, but scrolling around complex pages (Engadget's always a good example) was effortless; you see the typical grid pattern when you first scroll into a new area, of course, but it fills in with the correct content rapidly. To say we were blown away by the N900's raw browsing power would be an understatement -- in fact, we could realistically see carrying it in addition to another phone for browsing alone, because even in areas where it gives a little ground to the iPhone or Pre in usability, it smacks everyone down in raw power and compatibility.
Steve Jobs, in launching the ARM-based iPad yesterday said the iPad offers the...
best [Web] browsing experience you’ve ever had. A whole Web page right in front of you that you can manipulate with your fingers. Way better than a laptop
That's quite a claim, and if it is even halfway true it suggests that ARM is catching up fast. Some would argue that massive processing power is needed just to run Windows 7; indeed this has been suggested elsewhere. But in a similar way to Intel, Microsoft's problem is selling people an operating system which they might not need. In any case, telling consumers that they need 1GHz of CPU power just to run the operating system and virus checker has to be a risky strategy.

Part of this change is due to software becoming smarter, but most of it is simply that ARM chips are getting faster. It has all happened rather suddenly. Not much more than a year ago TI brought out a 600MHz Cortex-A8 chip with 1200 DMIPS of performance (as used now in the Nokia N900). Atom offered 1.6GHz and perhaps 3900 DMIPS. No contest, although it could be pointed out that a better comparison in terms of power consumption was 1950 DMIPS for the 600mW Atom (excluding the 2-3W used by the companion chip). Then in September, ARM announced their 2GHz Osprey development. Aimed at around 2W, this claimed to offer around 10,000 DMIPS, twice that of the fast Atom but at less power, since no companion chip is required. Still, it was only a design, not real silicon, so Intel carried on with its original iPhone ARM11 comparisons. After all, Intel has great plans for the future as well.

But in the past few months we have seen a dual core 1GHz Cortex-A9 from nVidia, Qualcomm's 1.5GHz dual core Snapdragon (ARM compatible) and, of course, TI's next step along the path, the OMAP4440. So it's not just designs: we now have chips. Still, chips don't equal final products, and the iPhone 3GS and Nokia N900 are using only a now-lowly 600MHz Cortex-A8. So Intel can be safe for another few years, surely?

Initial ARM-based netbooks were cheap but not necessarily stellar on the performance side. But now Acer has announced that it is looking at tablets, and hinted that this might involve ARM. Freescale is talking ARM tablets, and the Apple iPad includes an ARM core (perhaps a single core Cortex-A9 with Mali graphics). Gartner Group thinks that Android on ARM is snappier than Windows 7 on Atom. Suddenly, ABI Research is predicting that ARM PCs will outsell Intel in 2013.

The performance argument for x86 might be wearing thin. Perhaps all that junk silicon really isn't for anyone's benefit. Consumers appear to be moving fast to more portable devices: laptop sales are increasing whereas desktop PC sales are actually in decline year on year. This could explain the meaning of the iPad, an attempt to capture 80% of the 'computer time' of consumers with a new device which does most of what they need for less money and less hassle. In summary, ARM appears to be overtaking Intel's Atom on speed (this is unlikely to last!), while Intel is really struggling with battery life. I know where I'd rather be in this race.

And the Winner Is?

Things are going to be very interesting over the next few years. Both ARM and Intel really have their work cut out for them to grow their market share from their respective home bases. Of course it is too early to declare that ARM is going to take over the market for low-end computers. Perhaps it doesn't matter anyway, given that in many cases the Bluetooth, WiFi, flash media and graphics components already include an ARM. Some would argue that the takeover happened long ago. But in the headline CPU, where Intel is Inside, my suggestion is: sell Intel, buy ARM.