Friday, 12 August 2016

i.MX6SX - Realtime CAN interfacing Part 3 (UDOO Neo)





In the last post we covered how the RX8 instrument cluster could be controlled from the A9 side. Given the A9 side is feature rich in its capabilities, it offers numerous possibilities. As a simple demonstration we will attempt to replicate the RPM and speed gauges and two warning indicators on the HDMI (720p) display to represent a simplified virtual instrument cluster. The main challenges here are:

1. On-screen widgets need high-performance rendering.
2. Keeping the on-screen gauges/warning indicators in sync with the instrument cluster.
3. Minimising CPU usage.
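For challenges 1 and 2, the key is to derive the on-screen needle position from the same value that is sent to the cluster, so both stay in sync by construction. Below is a minimal sketch (hypothetical, not the actual cluster_gauges code) of mapping a gauge value to a needle angle that could feed an OpenGL rotation:

```python
def value_to_angle(value, v_min, v_max, a_min, a_max):
    """Linearly map a gauge reading to a needle angle in degrees."""
    value = max(v_min, min(v_max, value))  # clamp to the dial's range
    frac = (value - v_min) / float(v_max - v_min)
    return a_min + frac * (a_max - a_min)

# Example: an RPM dial sweeping from -120 deg (0 rpm) to +120 deg (10000 rpm)
needle = value_to_angle(5000, 0, 10000, -120.0, 120.0)  # 0.0, straight up
```

The same mapping, with different dial ranges, serves the speed gauge.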



In the video, the on-screen RPM gauge (on the left) fairly accurately tracks the RPM needle on the instrument cluster. The on-screen speed gauge (on the right) tracks the digital speed indicator. We also toggle the battery and oil warning indicators on the screen and cluster. Notice that the cluster_gauges application is consuming roughly 10% of the CPU.

In order to deliver the necessary performance the on-screen gauges were rendered using custom OpenGL ES 2.0 code. Note the graphic images used for the gauges are modified versions derived from this Android Arduino project. Compared to the previous post there is now a single application (cluster_gauges) on the A9 side which both renders the widgets and controls the instrument cluster through the M4.




I hope that through these three blog posts I have managed to demonstrate how the M4 and A9 processors can be combined to provide a rich real-time interface and data distribution mechanism for your applications.



Thursday, 11 August 2016

i.MX6SX - Realtime CAN interfacing Part 2 (UDOO Neo)

As mentioned in my previous post, the next stage of development with the RX8 instrument cluster is to offer the ability to control the gauges/indicators on the cluster from the A9 side. In the last post I also pointed out that to keep the cluster active the M4 side has the responsibility of regularly sending/receiving CAN messages to/from the cluster. Hence we can't use the CAN interface from the A9; instead we get the A9 to 'talk to' the M4. Within the i.mx6sx the Messaging Unit (MU) module enables the two processors to communicate with each other by passing coordinated messages. We can make use of this feature to send messages from the A9 to the M4 to ask it to update the cluster, and use the M4 to forward CAN messages received from the cluster to the A9. By the way, this all happens while the M4 is also feeding the cluster with its own CAN messages. For the i.mx6sx the inter-processor communication is abstracted and implemented through the Remote Processor Messaging (RPMSG) framework which is supported in both the Linux and FreeRTOS BSPs.
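To make this concrete, an A9-to-M4 command can be a small fixed-size record written to the RPMSG endpoint. The layout below is purely illustrative (the real firmware protocol is not shown in this post): one command byte, one flags byte and a 16-bit little-endian value.

```python
import struct

# Hypothetical command ids - NOT the actual firmware protocol
CMD_SET_RPM = 0x01
CMD_SET_SPEED = 0x02

def pack_cmd(cmd, value, flags=0):
    """Pack a command into 4 bytes: id, flags, 16-bit LE value."""
    return struct.pack('<BBH', cmd, flags, value)

msg = pack_cmd(CMD_SET_RPM, 3000)  # b'\x01\x00\xb8\x0b'
```

On the A9 side a record like this would simply be written to the RPMSG character device; the M4 unpacks it and updates its CAN frame state.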



Compared to the last demonstration, the main change to the configuration is the introduction of a relay board (the Power Control box in the diagram) so that the instrument cluster can be powered on and off by the M4.


The M4 firmware has been amended to accept commands from the A9 via RPMSG. On the A9 side we have 2 applications:

1. read_can - Constantly reads CAN messages sent from the instrument cluster via the M4.
2. cluster_control - Controls the instrument cluster via the M4.

cluster_control has the following capabilities through a number of command line options:

-i  - turns the instrument cluster on/off
-r  - sets the RPM
-s  - sets the speed (in km/h, although the display is in mph)
-b  - sets the Battery Warning indicator
-o  - sets the Oil Warning indicator
-c  - sets the Cruise Control indicator
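Since -s takes the speed in km/h while the RX8 display shows mph, a conversion like the following (a trivial sketch, not part of cluster_control itself) is handy for sanity-checking what the display should read:

```python
KM_PER_MILE = 1.609344  # exact international definition

def kmh_to_mph(kmh):
    """Convert km/h (what -s takes) to mph (what the display shows)."""
    return kmh / KM_PER_MILE

print(round(kmh_to_mph(100), 1))  # 100 km/h reads as roughly 62.1 mph
```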


The short video demonstrates the two applications running on the A9: read_can is running in the window on the left, and on the right cluster_control is used to manipulate the cluster. Note that prior to the two applications being launched, the M4 was preloaded with new firmware and RPMSG was initialised on the A9 side. Once cluster_control changes the cluster state, the M4 is responsible for continually updating the instrument cluster with the new state until the next change.

Now that we have the ability to control the cluster and receive messages from it on the A9 side, in the final post I will demonstrate how another feature of the i.mx6sx can be used with it.

Wednesday, 3 August 2016

i.MX6SX - Realtime CAN interfacing Part 1 (UDOO Neo)

One of the most promising capabilities of the i.mx6sx is combining the Cortex M4 with the on-board CAN interfaces. This feature opens up a number of possibilities for deploying the i.mx6sx for a varied range of automotive solutions where real-time processing of CAN messages is handled by the M4 and the A9 is kept for non-critical processing. To demonstrate this, let's take a simple use case of driving an automotive instrument cluster. In a series of blog posts I will cover how we can drive the instrument cluster initially from the M4, then from the A9 communicating via the MU (Messaging Unit) to the M4, and finally with on-screen virtual dials mimicking the actual instrument cluster dials.



For this exercise we used an instrument cluster pulled from a Mazda RX8. The RX8 instrument cluster interface is partially analogue and partially digital. The digital interface is provided through a high speed CAN (500 kbps) interface that drives the dials (RPM, Oil, Temperature), the digital speedometer and a number of warning indicators. Time permitting, I may cover the RX8 instrument cluster CAN interface in more detail in another blog. The target board for development was the UDOO NEO complemented with SN65HVD230 CAN transceivers.

For this first demonstration a FreeRTOS application was developed to run on the M4 that regularly sends CAN messages to the cluster and dumps CAN messages generated on the bus by the cluster to the serial port. In order to drive the dials and suppress the warning indicators, the instrument cluster requires a regular feed of CAN messages sent at 10 millisecond intervals. When the cluster is active it also generates CAN messages at roughly 50 millisecond intervals.
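The 10 ms feed is the timing-critical part that justifies putting this work on the M4. Under FreeRTOS this kind of loop is typically built with vTaskDelayUntil; the same drift-free idea, sketched here in Python purely for illustration, is to advance the deadline by a fixed period rather than sleeping for 'now + period', so jitter does not accumulate:

```python
import time

PERIOD_S = 0.010  # the cluster expects a frame every 10 ms

def run_periodic(send_frame, cycles):
    """Call send_frame() once per period without accumulating drift."""
    next_deadline = time.monotonic()
    for _ in range(cycles):
        send_frame()
        next_deadline += PERIOD_S  # fixed-step deadline, not now + period
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)

sent = []
run_periodic(lambda: sent.append(time.monotonic()), 5)
```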

The video above shows the application (rx8_can) being launched from U-Boot. Before launching the application, notice that all warning indicator lights are active on the instrument cluster. After the application is launched all warning indicators are cleared and the RPM gauge and speedometer are active. The application also dumps the instrument cluster's generated CAN messages to the serial port (shown in the terminal to the right). After launching the application we boot to a Linux console to demonstrate that the A9 is unaffected while the M4 continues to control the instrument cluster.

In the next post I aim to demonstrate how we can utilise the Messaging Unit to allow the A9 to control the instrument cluster by communicating with the M4.

Sunday, 28 February 2016

i.MX6SX - Video processing via VADC & PXP (UDOO Neo)

Unlike the rest of the i.mx6 family, the i.mx6sx has limited hardware video decoding support; for example there is no H.264 or MPEG support. The i.mx6sx hosts a simpler video analogue to digital converter (VADC) which allows the conversion of an NTSC or PAL signal to a YUV444 format. The YUV444 is slightly unusual in that it's encoded as a 32-bit value which equates to YUV32 in V4L2 terms. Given the unusual YUV32 output, decoding becomes more interesting because it heavily relies on the PXP (Pixel Pipeline) engine for colour space conversion and onward rendering to a display. The PXP can be considered a smaller brother of the IPU (Image Processing Unit) found on the rest of the i.mx6 family. In addition to colour space conversion, the PXP can flip (horizontally/vertically), scale and blend/overlay images from multiple sources. The one caveat is that the PXP on the i.mx6sx is limited in the input and output formats it can accept.
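To illustrate the colour space conversion the PXP performs in hardware, here is the per-pixel BT.601 YUV-to-RGB maths in plain Python. (The exact byte order of the YUV32 word should be checked against the V4L2 documentation; this sketch assumes the Y, U and V components have already been extracted from the 32-bit value.)

```python
def clamp8(x):
    """Clamp to the 0-255 range of an 8-bit channel."""
    return max(0, min(255, int(round(x))))

def yuv_to_rgb(y, u, v):
    """BT.601 full-range YUV (offset-binary chroma) to RGB."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp8(r), clamp8(g), clamp8(b)

# Neutral chroma (u = v = 128) gives a grey pixel:
print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128)
```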



Engine check sensor image
To give you an idea of what the PXP can do, the video above shows the output of an NTSC camera blended with a graphics overlay. The camera is actually a parking (reversing) camera (courtesy of the UDOO team); this is highlighted by the red/yellow/green lines and the blinking 'STOP' text which form part of the camera output. Given the automotive theme, the graphics overlay represents an engine check sensor dash. The output video (720x480) fits nicely on the UDOO 7" LVDS display. As the UDOO Neo is targeted towards sensor technology (IOT), we use the tilt angle from the gyroscope to determine the amount of alpha blending to apply (the graphics fade away or become fully visible). Furthermore, depending on the axis of the tilt, the combined video output is flipped horizontally or vertically (in real time), noticeable by the 'STOP' text and graphics appearing reversed or upside down.

Given the above, I have put together a simple example that demonstrates how to read the camera output and pass it to the PXP for onward processing. The example is partially based on existing Freescale code snippets but modified to use the PXP, and then simply renders to a frame buffer. The example will only run against a frame buffer and not on X11. Unlike the above video, the example does not use GPU hardware acceleration, therefore the CPU usage will be high. In order to build the example you require the PXP headers/library to be built and installed, which are part of the imx-lib package.

To build:

make clean
make

To run:

./imx6sx_vdac_pxp

Optional parameters

-t Time in seconds to run
-h flip image horizontally
-v flip image vertically
-i invert colours


Sunday, 31 May 2015

i.MX6SX - UDOO NEO Early hardware prototype

A few weeks back I received an early hardware prototype of the UDOO NEO. It hosts one of the newer members of the i.mx6 family, the i.MX 6SoloX, which integrates a Cortex A9 with a Cortex M4. This seems to be Freescale's first venture with a heterogeneous SOC; interestingly there may be more down the road with the introduction of the i.mx7 (Cortex A7 + Cortex M0/M4).

What is striking about the i.mx6sx architecture is that the primary processor is the Cortex A9. Therefore to boot, the i.mx6sx requires U-Boot (or another i.mx6 bootloader) to be present. Once the A9 is active the secondary processor (Cortex M4) can be brought online, either through U-Boot or after the A9 starts running Linux. In either case, application code (and/or a bootloader) has to be made available to the M4 via on-board flash or loaded into RAM. The M4 is brought online via a reset, following the standard Cortex M4 start-up process of booting from a vector table at address zero (which in the A9 world is mapped to 0x007f8000). From what I understand, address zero seems to be 32K of TCM (tightly-coupled memory). The M4 seems to be capable of interacting with most of the on-board peripherals provided by the i.mx6sx. Other areas of interest on the i.mx6sx are:
  • A semaphore mechanism (SEMA4) to aid with processor contention
  • A messaging mechanism (MU) to enable the two processors to communicate and coordinate by passing messages
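Returning to the M4 boot process, the vector-table layout is worth spelling out because it determines what must sit at 0x007f8000 before the M4 is released from reset: word 0 is the initial stack pointer and word 1 is the reset handler (with bit 0 set to indicate Thumb mode). A quick sanity check of a firmware image might look like this (illustrative Python with made-up addresses):

```python
import struct

def parse_vector_table(image):
    """Return (initial SP, reset handler) from a Cortex-M image."""
    sp, reset = struct.unpack_from('<II', image, 0)
    # Bit 0 of the reset vector is the Thumb bit and should be set.
    return sp, reset & ~1

# Fabricated example image: SP in RAM, reset handler just past the table
img = struct.pack('<II', 0x20008000, 0x007F8101) + b'\x00' * 8
sp, reset = parse_vector_table(img)  # (0x20008000, 0x007F8100)
```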
The i.mx6sx isn't intended as a multimedia processor and this is highlighted by the lack of a VPU and the inclusion of a low spec GPU, the Vivante GC400T, which claims up to 720p@60fps and is limited to OpenGL ES 2.0.

What is unusual is the inclusion of the video analogue to digital converter (PAL/NTSC). Given that Freescale's target market is automotive, I suspect this is for applications like reverse parking.

In its current form the NEO has a small footprint (approx. 85mm x 60mm) and the notable on-board peripherals are:
  • Ethernet (10/100) - KSZ8091RNA
  • Wifi/bluetooth - WL1831MOD
  • HDMI  - TDA19988
  • Gyroscope - FXAS21002C
  • Accelerometer + Magnetometer - FXOS8700CQ
  • 2 user-defined LEDs
  • 1 USB 2.0 type A female connector
The FXAS21002C and FXOS8700CQ are currently interfaced through I2C and the datasheets indicate a maximum clock speed of 400 kHz (in Fast Mode).
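That 400 kHz ceiling puts a rough bound on sensor sample rates: each byte on the I2C bus costs 9 clock cycles (8 data bits plus an ACK). A back-of-the-envelope estimate, ignoring start/stop conditions and clock stretching:

```python
I2C_CLOCK_HZ = 400_000  # Fast Mode ceiling for these sensors
BITS_PER_BYTE = 9       # 8 data bits + 1 ACK bit

def transfer_time_us(n_bytes):
    """Approximate bus time for an n-byte transfer, in microseconds."""
    return n_bytes * BITS_PER_BYTE / I2C_CLOCK_HZ * 1e6

# e.g. a 6-byte XYZ sample plus ~2 address/register bytes
print(round(transfer_time_us(8), 1))  # 180.0
```

So a single accelerometer read occupies the bus for well under a millisecond, which is ample for typical sensor polling rates, though SPI would still be faster.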

 

Software for the NEO is still in the early stages of development, so it's taken me a couple of weeks of development effort to get U-Boot and a kernel functioning. Above is a short video demonstrating the fruits of my efforts. On power up we launch a small application on the M4 from U-Boot which toggles (flashes) the red LED and continually outputs a message on UART 2 (shown later in the video). We then boot the kernel to a Ubuntu rootfs and launch an OpenGL ES application which outputs a rotating square to HDMI (720p). The speed of rotation increases the more the board is tilted to the left or right (using the accelerometer). Note the red LED continually toggles, highlighting the fact that the Cortex M4 is uninterrupted by the A9.

Having started to develop code for the TDA19988 interface, I found one useful feature of this controller is its ability to support both DVI and HDMI displays. The downside is that the GPU struggles past 720p in my tests.

Another point to note is that there is no on-board flash for the M4. Therefore code for the M4 has to be loaded into RAM, which means reserving a small portion away from the kernel. This introduces another layer of complexity as the reserved area needs to be protected from accidental access.

Developing for the board definitely gives you a good insight into its usability. This leads to my wish list, if the board were to be improved:

1. 2 x USB 2.0 type A connectors, for keyboard & mouse, useful when used with HDMI.
2. 2 x Ethernet (this would definitely make it stand out from the competing SBCs).
3. Use SPI for the FXAS21002C & FXOS8700CQ, or ideally replace them with a 9-axis IMU.
4. I'd prefer the sensors to be on a separate PCB that can be connected via a jump cable to the main board. The main reason being that manoeuvring the board to test the sensors with the cables attached (ethernet, serial, power, USB, HDMI) isn't that practical.
5. On-board JTAG support; it is very difficult to debug the M4 code without it!
6. There are two UART outputs on the NEO, one for the A9 serial console and the other for the M4. Similar to the UDOO, it would be very useful if these were exposed via a single USB connector. In the current design you need to hook up 2 serial-to-USB adapters.

Wednesday, 15 April 2015

IOT - ESP8266 (ESP-201) + CC1110 (XRF)

Over the past few months there has been a huge amount of interest in the ESP8266 which offers a low cost serial to WiFi solution. The introduction of the ESP8266 by Espressif coincides with the increased hype around IOT (Internet of Things) as the next major emerging technology. Realistically IOT has been around for decades in different disguises, so the concept isn't new. I'm guessing the renewed interest is partly because sensors, processors and networks are dirt cheap and wireless technology is ubiquitous. Coupled with the rise of the maker culture, it now means the cost of entry is low for connecting devices.

There are numerous modules available that host an ESP8266 and these can be found for a few dollars, however most have limited pins broken out: UART plus some GPIO. In my opinion a better option is the ESP-201 which has a larger number of pins broken out and an external U.FL antenna connector (IPX antenna). With a slight modification the module can be made breadboard compatible.



The main drawbacks with the ESP-201 are:

1. Lack of CE/FCC compliance (the ESP8266 chip is certified, but not the module).
2. The pin markings are on the underside of the PCB, therefore not visible when it is sitting in a breadboard.
3. The UART pins protrude on the underside of the PCB, which means it can't be plugged into a breadboard without bending these pins at a right angle (as shown) or, ideally, de-soldering the pins and re-attaching them the correct way round.



To evaluate the ESP8266 I chose to integrate it with an existing wireless (868MHz) set up which is used for monitoring a number of sensors as well as providing periodic temperature readings. The existing radio set up uses Ciseco's XRF which hosts TI's low power CC1110 transceiver. It runs custom developed firmware that is controlled through one of the two CC1110 UART ports. The plan was to extend the firmware so that the CC1110 could be controlled over TCP by running a socket server on the ESP8266. Compared to the ESP8266, the CC1110 has a wealth of reference documentation which increases the options for interfacing with other devices in addition to easing the programming burden. The main drawback with the CC1110 is the cost of the development tools, although it is possible to use SDCC (Small Device C Compiler) as an alternative.

Initially I was hoping to use I2C/SPI as the interface between the CC1110 and the ESP8266. However, due to the CC1110 not supporting hardware I2C, and coupled with the fact that I had just a few free I/O pins remaining, I was left with one option: the second UART port.


Espressif provide an SDK that can be flashed to the ESP8266 which provides a simple AT command style interface to program the device. Note, the SDK is continually being updated so check for later releases. One quirk with the ESP-201 is that IO15 has to be grounded for the device to function. To flash the device, IO00 has to be grounded. Instructions for SDK set up on Linux can be found here. To flash the AT firmware for SDK 1.0 on Linux we can issue the following command:

esptool.py write_flash 0x00000 boot_v1.2.bin 0x01000 at/user1.512.new.bin 0x3e000 blank.bin 0x7e000 blank.bin

After flashing the ESP-201, the bulk of the coding was done on the CC1110, which mainly entailed sending AT commands to the ESP-201 to initiate a connection to the access point, launch a socket server and send updates from the sensors. The sequence of AT commands was similar to this:

AT+RST
ATE0
AT+CWMODE=1
AT+CWJAP="SSID","Your Password"
AT+CIPMUX=1
AT+CIPSERVER=1,80

After coding up the AT commands on the CC1110, I could test by launching a telnet session on port 80 to the IP address allocated via DHCP from the AP. Output sensor data from both the CC1110 UART port and the ESP-201 is shown below.




Coding the above highlighted a number of pitfalls with the AT firmware and hardware.

  • It can be tedious to parse the verbose and inconsistent responses returned by the ESP8266 to AT commands. To tone down the verbose responses I used ATE0, however it's not persistent so needs to be re-sent after a reset.
  • Resetting (AT+RST) or joining an access point (AT+CWJAP) can be slow, therefore you need to carefully select relevant time out values.
  • STA mode (AT+CWMODE=1) can silently disconnect after a random time.
  • The ESP8266 isn't particularly well suited as a battery powered device because it can consume up to 300mA.
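The first pitfall (parsing the inconsistent responses) is worth a concrete sketch. On the CC1110 this was done in C, but the idea is the same in any language: skip the echoed command and blank lines, treat 'OK'/'ERROR'/'FAIL' as terminators, and collect everything else as payload. A simplified illustration:

```python
def parse_at_response(raw):
    """Return (ok, payload_lines) from a raw AT response string."""
    ok = False
    payload = []
    for line in raw.replace('\r', '').split('\n'):
        line = line.strip()
        if not line or line.startswith('AT'):
            continue  # drop blanks and the echoed command
        if line in ('OK', 'SEND OK'):
            ok = True
        elif line in ('ERROR', 'FAIL'):
            ok = False
        else:
            payload.append(line)
    return ok, payload

ok, payload = parse_at_response('AT+CWJAP?\r\n+CWJAP:"MySSID"\r\n\r\nOK\r\n')
# ok is True, payload is ['+CWJAP:"MySSID"']
```

On a memory-constrained MCU the same logic would run incrementally over the UART byte stream rather than buffering the whole response.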

It is possible to write your own firmware instead of using the pre-built AT firmware, which in my opinion is a better option. Espressif provide a set of closed-source C libraries which offer a finer level of control compared to the AT firmware. Having spent a considerable amount of time writing custom firmware to interface to the CC1110, here are my findings:

  • Although there is a second UART available on the ESP8266, in most circumstances only the TX pin is available (its primary use is for debugging) because the RX pin is used as part of the SPI interface for the flash memory.
  • There is no in-circuit debugging option; you're reliant on sending tracing output to the UART port or somewhere else.
  • Although SSL support is provided, it seems to be a hit and miss affair between SDK versions.
  • The API is closed source, so you're reliant on Espressif providing regular updates for new features or bug fixes.
  • No hardware encryption.
  • Not all I/O features are available, e.g. RTC or I2C.

Given the amount of attention the ESP8266 has received, it is fair to say it does offer a low cost and rapid approach to prototyping a WiFi solution for existing hardware or for a new application. However, you could argue that most of the attention has come from hobbyists and not commercial ventures. In my opinion it is worth exploring other WiFi SOCs that are coming to the market this year such as:


Furthermore, it is still not clear whether WiFi (2.4GHz or 5GHz) is the ideal medium for wireless IOT as the wake up and connect times aren't particularly quick. The other point to make is that the cost of some of the above SOCs can make them overlap with traditional networking SOCs which are used in low cost router boards. One example is the AR9331 which supports a full Linux stack and can be used for video streaming or complex routing, something the WiFi SOCs may find hard to achieve.

Sunday, 25 January 2015

RK3288 - Firefly development board

I received this board just over a month ago from the Firefly team and have been keen to assess its development capabilities given it hosts a quad core Cortex A17 (or technically a Cortex A12) processor.


On board is a Rockchip RK3288 processor which on initial glance has a pretty decent specification:

  1. 4Kx2K H.264/H.265(10-bit) video decoder
  2. 1080p H.264 video encoder
  3. Up to 3840X2160 display resolution
  4. 4Kx2K@60fps HDMI 2.0
  5. eDP 4Kx2K@30fps
  6. According to Rockchip the GPU core is a Mali-T764, although it's reported as a Mali-T760 by the Mali drivers.
  7. Ethernet controller supporting 10/100/1000 Mbps

Given the above I think it is important to clarify what 4Kx2K actually means, and the clue is in point 3. Having spent many hours debugging the kernel driver display code, it turns out the RK3288 has 2 display controllers known as VOP_BIG and VOP_LIT (I presume for little). VOP_BIG supports a maximum resolution of 3840x2160, which equates to 4K UHD (ultra high definition television), and for VOP_LIT it's 2560x1600 (WQXGA). Each controller can be bound to a display interface, e.g. HDMI, LVDS or eDP (sadly missing on the Firefly). If you define 4Kx2K as 4096x2160, also known as DCI 4K, then the definitions can be misleading. The numbers also align with H264/VP8/MVC decoding which maxes out at 2160p@24fps (3840x2160), although the HEVC (H265) decoder contradicts this by supporting 4K@60fps (4096x2304). What is also interesting is that the image processing engine can upscale to 3x the input, which would imply 720p can be upscaled to 4K UHD.
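The arithmetic behind those claims is easy to check (a trivial sketch; the resolutions are as quoted above):

```python
UHD_4K = (3840, 2160)   # what VOP_BIG can drive ("4K UHD")
DCI_4K = (4096, 2160)   # cinema "DCI 4K" - 256 pixels wider
HD_720P = (1280, 720)

# A 3x upscale of 720p lands exactly on 4K UHD, not DCI 4K:
scaled = (HD_720P[0] * 3, HD_720P[1] * 3)
print(scaled)  # (3840, 2160)
```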

The Firefly seems to be based on an Android TV box reference design and it could be argued that it's targeted as an Android centric development board. The noticeable peripherals are:

1. VGA support + HDMI
2. On board microphone
3. Wifi Connector (RF Connector SMA Female)
4. IR Sensor
5. 16GB eMMC

Firefly supply a pre-built dual boot image (on eMMC) that offers Android 4.4
and Ubuntu 14.04.

Android 4.4


Firefly supply the Android SDK sources so that customised images can be built from source. What is nice is that Android Studio integrates well with the Firefly, which eases development of Android apps, especially for newcomers. Furthermore the app can be remote debugged while running on the Firefly. I suggest that you sign your app with the platform key from the SDK to ease integration when remote debugging. One pitfall to bear in mind is that Android 4.4 implements SELinux, so you may find accessing I/O via sysfs (e.g. GPIO) from your Android app is severely restricted.

Ubuntu 14.04


The Ubuntu image uses a patched version of the Android kernel and unfortunately has no support for GPU/VPU acceleration.

Linux Support


Historically many ARM SOC vendors have side-stepped any request to provide meaningful Linux support and instead rely on the developer community to progress this as far as they can. Unfortunately the situation is normally exacerbated by the SOC vendor's lack of co-operation on GPU/VPU support. What's not recognised by ARM is that this fragmentation is clearly benefiting Intel, whose latest low power SOCs have far superior out of the box Linux support.

As of today I would argue the RK3288 falls midway between no and full Linux support. The reason for this is Rockchip's effort to develop a Chromebook device: if you examine the Chromium OS source tree you will find numerous patches submitted by Rockchip.

So the question becomes: can we make use of that source tree? Well, my initial aim was to get a minimal kernel booting and ideally test if GPU support was possible. The video below shows the progress made after numerous weeks of attempting to bring up a workable kernel. In the video I launch Openbox under X and run es2gear, glmark-es2 and some WebGL samples in Chromium.



Although the video may seem impressive, the only GPU acceleration available is through EGL/GLES, hence the WebGL examples are accelerated. What is important to bear in mind is that the xf86-video-armsoc driver lacks 2D support for Rockchip, therefore there is still a fair amount of software rendering in X11, plus I implemented workarounds for broken code. Furthermore, performance isn't particularly spectacular; es2gear & glmark-es2 numbers are here. Unfortunately I haven't had the time to investigate the cause(s), however further progress may be hindered by the lack of newer Mali drivers from Rockchip/ARM.

For those of you expecting a future Rockchip Chromium OS image to work on an existing RK3288 device, you may be disappointed; unfortunately the hardware differences make this a slim possibility.