
It is my pleasure to introduce this blog post written by Josué Kpodo, who has joined the Moddable team for the summer as an intern. Josué comes to us from Michigan State University where he recently completed his undergraduate degree in Computer Science. In this post, Josué will share a bit about his background and interests, the project he is working on this summer, and his progress in learning JavaScript and applying it to embedded systems. - Andy Carle

Developing ECMAScript Modules for Reliable Sensor Metadata

My name is Josué Kpodo and I am currently working as a summer intern for the Moddable team. My work consists of researching the properties of trustable sensor data as well as developing JavaScript APIs for making that data more accurate for data science purposes.

I recently graduated from Michigan State University with a B.S. in Computer Science and a minor in Technology Systems Management. Both fields sit at the intersection of my passion and my goal of creating autonomous systems that benefit developing countries in manufacturing, remote sensing, and digital farming.

This summer, my work is targeted at embedded systems that run JavaScript. Moddable is part of Ecma TC53, the standards body that specifies APIs for JavaScript modules for embedded systems. My work will help inform the development of TC53 APIs for sensor data and metadata.

One of my key interests is collaborating with non-embedded developers on hardware-based projects. The work being done by TC53 and Moddable makes it much easier for these developers to get started with embedded systems.

Partnership Between Moddable and MSU

Speaking of collaboration, one of the participants in TC53 is Michigan State University (MSU), specifically Professor Joshua Siegel. Dr. Siegel is an Assistant Professor in Computer Science and Engineering at MSU. His research involves the development of automotive sensing systems and embedded artificial intelligence. A detailed description of his research can be found on his webpage.

Through their work at TC53, Moddable and Professor Siegel are collaborating on the sensor metadata project I am working on for this summer. The main goal of this project is to improve the role of sensor data in data science. More precisely, we aim to create a standard API for sensor provenance by encoding metadata along with samples to improve data utility—mainly for data science applications.

The first part of my work is to conduct interviews with potential stakeholders and end users: primarily data scientists, cloud application developers, and even high-level embedded application developers. These interviews are crucial because they help us understand our stakeholders' needs around trusting their sensors' data. In turn, this will help define the attributes that make data useful and accurate.

The last part will focus on developing the specifications of standard APIs, based on the results of the interviews. These APIs will enable sensor manufacturers and driver developers to provide standardized documentation on how data and metadata (e.g., latency, timing) are configured for sensor readings. Such documents will, in turn, allow our end users to use that metadata to improve their workflows. For example, data scientists could use such metadata to make further decisions about how to analyze the sensor readings before applying or deploying their machine learning models. In the case of autonomous cars, metadata such as timing and latency are mission-critical: by knowing how they were configured, a data scientist can determine whether the sensor readings are reliable before using them to train sensor fusion algorithms such as RTLS or Bayesian SLAM. In the manufacturing sector, information such as which assembly line a particular robot belongs to, or the last time its firmware was updated, could be important metadata for improving task automation and reducing inconsistencies during manufacturing. These are just some examples of how metadata could be used.
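
To make this concrete, here is one hypothetical shape a metadata-bearing sample might take. The property names below are purely illustrative and not part of any TC53 specification:

// Hypothetical example: the property names are illustrative only,
// not part of any TC53 specification.
const reading = {
    value: { x: 0.02, y: -0.98, z: 0.05 },  // the sample itself, in g
    metadata: {
        sampleTime: 1594068000000,   // when the sample was taken
        latency: 4,                  // ms between measurement and delivery
        sampleRateHz: 119,           // configured output data rate
        firmwareVersion: "1.2.0"     // e.g., for the manufacturing use case
    }
};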

My Progress

My engineering work thus far has mostly involved working with sensors and writing applications for devices like the NodeMCU ESP8266 board. The Moddable SDK contains useful APIs and drivers which make development very practical and straightforward. For example, I was able to write an application that exposes the orientation of a device to the Adafruit IO cloud platform using just the Moddable IO and socket modules. Working on this application revealed to me the potential of the Moddable SDK for any type of embedded development.

Detecting the orientation of a device

The efficient handling of sensor data is at the core of our goals. To this end, I found it convenient to develop a simple JavaScript application that can detect the orientation of the ESP8266 board. My goals for this application are to understand data filtering techniques and to learn how to connect to a cloud service from the SDK. The complete source code for this application is available at this GitHub repository.

This application uses the LSM9DS1 IMU for data collection, but any other accelerometer would work as well. All you need is an account on Adafruit IO, the ESP8266, your sensor (the LSM9DS1 IMU), and of course the Moddable SDK! I walk through some code snippets of the application logic below.

In our application, there are six possible orientations: up, down, face-up, face-down, left, and right. We configure the accelerometer to detect the orientation; once detected, the orientation is uploaded to an Adafruit IO feed for visualization.

The LSM9DS1 is a 9DOF (nine degrees of freedom) motion-sensing system that houses a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer. Using the LSM9DS1 driver, we can configure the device to use any of its three sensors. Here we select the accelerometer:

import LSM9DS1 from "lsm9ds1";

// Instantiate the driver on the board's I2C pins
let sensor = new LSM9DS1({ sda: 4, scl: 5 });
// Select the accelerometer (the driver can also use the gyroscope or magnetometer)
sensor.configure({ operation: "accelerometer" });

Once configured, we can retrieve the data on the x, y, and z axes:

let sample = sensor.sample();   // acceleration on each axis, in g
trace(`ax: ${sample.x}, ay: ${sample.y}, az: ${sample.z}\n`);

The main code logic is as follows. Here, we sample the accelerometer every 10 milliseconds to keep latency low and get near-instantaneous responses. We also keep track of the previous orientation and only act when a new orientation becomes available. Only then do we upload the orientation to Adafruit IO:

let prevFace;

System.setInterval(() => {
    let sample = sensor.sample();
    let face = getOrientation(sample); // e.g. "left", "right"

    // Upload only when the orientation has changed
    if (face !== undefined && face !== prevFace) {
        prevFace = face;
        streamData(face);
    }
}, 10);
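
(A note on System.setInterval: this global comes from the experimental Moddable IO runtime that this application builds on; an application built on the standard Moddable SDK runtime would typically use Timer.repeat from the timer module for the same purpose.)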

The accelerometer measures how fast velocity is changing, in g's (multiples of standard gravity at sea level, about 9.8 m/s²). That is, a motionless object experiences about 1g of acceleration toward the ground. With this information, we can easily detect which axis is being pulled downwards (-1g) or upwards (+1g). For example, consider the X-axis of the accelerometer, and assume it lies roughly parallel to the surface when the device is laying flat, as on a table. When sampling, if the readings from the X-axis converge toward a threshold of 1g, we can infer that the device is being held up. In contrast, if they converge toward -1g, the device is oriented downwards. I was more restrictive in my code and chose 0.8 instead of 1 as my orientation threshold.
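
As a sketch of that single-axis test (illustrative only; the full logic lives in getOrientation(), shown below):

// Illustrative sketch of the single-axis test described above
const ORIENTATION_THRESHOLD = 0.8;  // more restrictive than the theoretical 1g

function xAxisOrientation(sample) {
    if (sample.x >= ORIENTATION_THRESHOLD)
        return "up";        // X-axis pulled toward +1g
    if (sample.x <= -ORIENTATION_THRESHOLD)
        return "down";      // X-axis pulled toward -1g
    return undefined;       // between thresholds: inconclusive
}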

There is one caveat, however, and it is specific to the sensor's data sheet; the following explanation applies to the LSM9DS1 and may differ for other sensors. We know from the LSM9DS1 data sheet that, in its default configuration, the accelerometer values range from -2g to +2g. To keep our filtering optimal, we can further reduce the range down to about ±1g (between -1 and 1). To do so, we sum the sampled values on the three axes, take the absolute value of that sum, and require it to stay below a filtering threshold close to 1g (in my code, I chose 1.2).

Again, it all depends on the sensor's configuration. The TC53 Sensor Class Pattern provides an extensible mechanism for configuring sensors, allowing configuration of sensor interrupts, data sampling rate, FIFO modes, and more. By understanding the sensor's configuration, the user (e.g., the application developer) can better interpret the sensor's readings. Such configurations therefore represent the kind of metadata we plan on exposing as part of the standard APIs we are developing this summer.
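
As a rough sketch of what such a configuration call might look like (the option names here are assumptions for illustration, not taken from the TC53 specification or the Moddable driver):

// Hypothetical configuration options, for illustration only; actual names
// depend on the driver and on the evolving TC53 specification.
sensor.configure({
    operation: "accelerometer",
    sampleRateHz: 119,          // output data rate (assumed option name)
    range: 2,                   // full-scale range in g (assumed option name)
    fifoMode: "continuous"      // FIFO behavior (assumed option name)
});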

Continuing with our code logic, we can define a function getOrientation() that checks all those conditions for us and returns the appropriate orientation:

// The following constants are defined in main.js
const ORIENTATION_THRESHOLD = 0.8;
const FILTERING_THRESHOLD = 1.2;
// The six orientations; this ordering is an assumption for illustration
// (see the repository for the actual definition)
const POSITIONS = ["face-up", "face-down", "left", "right", "down", "up"];

function getOrientation(sample) {
    let orientation;
    ...
    let sumxyz = Math.abs(sample.x + sample.y + sample.z);

    // Reduce the range down to about ±1
    if (sumxyz < FILTERING_THRESHOLD) {
        if (sample.x * sample.y > 0 && sample.x >= ORIENTATION_THRESHOLD) {
            trace("right side up\n");
            orientation = POSITIONS[5];
        }
        else if (sample.x * sample.y < 0 && sample.x <= -ORIENTATION_THRESHOLD) {
            trace("up side down\n");
            orientation = POSITIONS[4];
        }
        ...
    }
    return orientation;
}

How can we connect and upload to our Adafruit IO feed? After creating an Adafruit IO feed, we configure our username, feed key, and AIO key in the manifest.json file:

...
"config": {
        "username": "<user_name>",
        "feedKey": "<your_feed_key_here>",
        "AIOKey": "<your_AIO_key_here>",
    },
...
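
The application can then read these values through the Moddable SDK's mc/config module:

// Values from the manifest's "config" block are exposed by mc/config
import config from "mc/config";

trace(`Streaming to feed ${config.feedKey} as ${config.username}\n`);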

The following function sends data to the feed with an HTTP POST request:

// Request comes from the Moddable SDK's http module
import {Request} from "http";

function streamData(data) {
    let request = new Request({
        host: "io.adafruit.com",
        path: `/api/v2/${config.username}/feeds/${config.feedKey}/data`,
        method: "POST",
        headers: [ "X-AIO-Key", config.AIOKey, "Content-Type", "application/json" ],
        body: JSON.stringify({ value: data }),
        response: String,
    });
    request.callback = function(message, value, etc) {
        // message 2: a response header was received (value is its name, etc its value)
        if ((message == 2) && (value == "status")) {
            if (etc == "200 OK") {
                trace(`Sent data "${data}"\n`);
            } else {
                trace(`Error sending data "${data}". Error code: ${etc}\n`);
            }
        }
    }
}

To see this application in action, please refer to the following video:

Event-driven programming

This programming paradigm is new to me, since I come from a C++ background. The Moddable team has been assisting me, through code review, in learning the event-driven way of programming. The team even challenged me to write an application that monitors the duration of button presses and replays those presses as a sequence of blinks on an LED. Despite its simplicity, developing this application made me realize the importance of JavaScript features such as callbacks and timers. A detailed description of this application can be found in the video below. By now, it feels natural to write event-driven code for embedded devices, handling inputs while doing other work at the same time. The full source code is available here.
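
As a rough sketch of the idea (this is not the actual application; the pin numbers are placeholders, and the Digital, Monitor, Time, and Timer modules from the standard Moddable SDK are assumed):

import Digital from "pins/digital";
import Monitor from "pins/digital/monitor";
import Time from "time";
import Timer from "timer";

// Sketch only: pin numbers are placeholders; LED polarity varies by board
const led = new Digital(2, Digital.Output);
const button = new Monitor({ pin: 0, mode: Digital.InputPullUp,
        edge: Monitor.Rising | Monitor.Falling });

let pressedAt;
const durations = [];

button.onChanged = function() {
    if (this.read() === 0)                  // pressed (button is active-low)
        pressedAt = Time.ticks;
    else if (pressedAt !== undefined)       // released: record how long it was held
        durations.push(Time.ticks - pressedAt);
};

function replay() {                         // blink the LED once per recorded press
    if (!durations.length)
        return;
    led.write(1);
    Timer.set(() => {
        led.write(0);
        Timer.set(replay, 250);             // short gap between blinks
    }, durations.shift());
}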

What's Next

Moving forward, there are a couple of things I would like to achieve. First, I hope to gain valuable insight from my one-on-one interviews with our stakeholders. I would like to understand our end users' processes for selecting their sensors, as well as where those sensors come from. Among other things, I hope to gather their input on what characterizes trustable (or unreliable) sensor data and then discover the best way to encode metadata about those sensors for data science purposes.

Another goal is to get better at reading and understanding diverse sensor data sheets, as this is crucial for encoding metadata. So far, I have explored the LSM9DS1 IMU data sheet and documented the configuration options available to developers, such as the sensors' output data rates and operating modes. Since my work depends on configuring sensors, learning to interpret data sheets for a variety of sensor types is a skill I am striving to develop. To that end, I will work closely with the Moddable team and use online resources such as SparkFun and Adafruit.

In conclusion, it is my hope that this summer project will lead to a standard API that makes life easier for developers who intend to use their sensors' data for machine learning and artificial intelligence applications.