Supporting sensors in Windows 8

Recent advances in sensor technology are catalysts for the acceleration and evolution of user experiences on PCs. The ability to react to changes in ambient light, motion, human proximity, and location is fast becoming a common and essential element of the computing experience. Even something simple, like an ambient light sensor that adjusts display brightness in a room with changing light, is a useful scenario even for desktop PCs. Of course, we also want to make sure you have full control over the use of these peripherals, since we know that different sensors open up opportunities for risk or abuse that some folks might not be comfortable with. This post looks at the details of supporting sensors in Windows 8 and was authored by Gavin Gear, a PM on the Device Connectivity team.
--Steven


The first thing we explored about sensors was how Windows 8 should use them at the system level, to adapt the PC to the environment while preserving battery life.
[h=3]Adaptive brightness[/h] The first system feature was automatic display brightness control, or what we call “adaptive brightness.” We first introduced this feature in Windows 7 using ambient light sensors (ALS), and it is targeted at mobile form factors like slates, convertibles, and laptops. With today’s display panels supporting brightness levels at approximately twice the intensity of what was common just a few years ago, this feature is more important than ever. By dynamically controlling screen brightness based on changing ambient light conditions, we can optimize for reading comfort, and save battery life when the screen is dimmed in darker environments.
A tablet PC in harsh outdoor lighting with adaptive brightness (left), and without (right)
You can see here that adaptive brightness helps you see content on the screen more clearly, since the screen automatically gets brighter when the tablet enters a bright environment. And for those of you who use your desktop PCs in a sunny room, you know this same thing can happen at different times of the day in different seasons.
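For apps that want to react to ambient light themselves, the same ALS data is exposed through the WinRT LightSensor class described later in this post. Here is a minimal sketch that reads the illuminance in lux and classifies the environment; the thresholds are illustrative values chosen for this example, not the ones Windows uses for adaptive brightness.

var lightSensor = Windows.Devices.Sensors.LightSensor.getDefault();
if (lightSensor) {
    lightSensor.addEventListener("readingchanged", function (e) {
        var lux = e.reading.illuminanceInLux;
        // Illustrative thresholds only; not the values Windows uses internally.
        var environment = (lux < 50) ? "dark" : (lux < 5000) ? "indoor" : "outdoor";
        console.log("Ambient light: " + lux + " lux (" + environment + ")");
    });
}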
[h=3]Automatic screen rotation[/h] Many smartphones and other mobile devices have established the expectation that when you rotate the device, the graphic display will also rotate and adapt to the new orientation (including adapting to aspect ratio changes). Data from an accelerometer allows the device to determine its basic orientation. By automatically rotating the screen, people can use their devices (primarily slates and convertibles) in a more natural and intuitive way, without needing to manually rotate the screen with software controls or hardware buttons.
Windows 8 Start screen in landscape and portrait orientations
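Windows performs this rotation for you, but to make the mechanism concrete, here is a rough sketch of how an orientation could be inferred from the accelerometer’s gravity vector: whichever axis gravity is mostly aligned with suggests whether the device is being held in landscape or portrait. This is an illustration of the idea only, not the logic Windows itself uses.

var accelerometer = Windows.Devices.Sensors.Accelerometer.getDefault();
if (accelerometer) {
    accelerometer.addEventListener("readingchanged", function (e) {
        var x = e.reading.accelerationX;   // reported in g's
        var y = e.reading.accelerationY;
        // When the device is held mostly upright, gravity falls mainly on one axis.
        var hint = (Math.abs(x) > Math.abs(y)) ? "landscape" : "portrait";
        console.log("Orientation hint: " + hint);
    });
}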
[h=3]Developer support for sensors[/h] Beyond figuring out the basics for how a Windows 8 system might use sensors, we also needed to think about how apps might use sensors. We looked at a variety of examples of sensor-enabled apps including games, commercial applications, tools, and utilities, to help us determine which scenarios to support.
First on the list was the ability for apps to understand motion and screen rotation. This requires an accelerometer – a device that can be used to measure the force due to gravity, and the motion of the device itself. But most scenarios require more than just an understanding of motion and gravity. Orientation is also an important requirement for many applications. To enable a PC to understand orientation we needed to integrate the functionality of a compass.
Supporting a compass would at minimum require a 3D accelerometer (which measures acceleration on three axes) and a 3D magnetometer (which measures magnetic field strengths on 3 axes). This combination of sensors is called a 6-axis motion and orientation sensing system, and can support a basic tilt-compensated compass, screen rotation, and certain casual game apps like a labyrinth style game. However, in our testing and prototyping, we found the 6-axis motion sensing system has two key drawbacks: sporadic compass inaccuracy, and a lack of the responsiveness required by 3D interactive games.
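To see why the compass needs both sensors, here is a simplified sketch of tilt compensation: pitch and roll estimated from the accelerometer’s gravity vector are used to rotate the magnetometer reading back into the horizontal plane before the heading is computed. This is the textbook version of the math; sign conventions depend on how the axes are defined on a particular device, and real implementations also deal with calibration.

// ax, ay, az: gravity from the accelerometer (in g's)
// mx, my, mz: magnetic field from the magnetometer (in microtesla)
function tiltCompensatedHeading(ax, ay, az, mx, my, mz) {
    var roll  = Math.atan2(ay, az);
    var pitch = Math.atan2(-ax, Math.sqrt(ay * ay + az * az));

    // Rotate the magnetic vector back into the horizontal plane.
    var bx = mx * Math.cos(pitch) + my * Math.sin(pitch) * Math.sin(roll)
           + mz * Math.sin(pitch) * Math.cos(roll);
    var by = my * Math.cos(roll) - mz * Math.sin(roll);

    var heading = Math.atan2(-by, bx) * 180 / Math.PI;   // degrees from magnetic north
    return (heading + 360) % 360;
}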
Recently, a new type of sensor has started to emerge on phone platforms – the gyro sensor. Gyro sensors measure angular speed, typically along 3 axes. You can also use the data from gyro sensors to increase the responsiveness and accuracy of 3D motion-sensing systems. A gyro sensor is very sensitive, but it lacks any form of orientation reference (such as gravity or north heading).
This diagram shows how gyro data is represented as a set of three rotations along the three primary axes for the device:

Initially, some thought that the need for such sensors was scoped to very few apps, such as specialized games. But the more we examined the 3D motion and orientation sensing problem, the more we realized that applications are much more immersive and attractive if they react to the kind of motion humans can easily understand, such as shakes, twists, and rotations in multiple dimensions. With these kinds of sensors it would certainly be possible to build very immersive 3D games, but it would also enable lots of other apps to more naturally respond to input from a variety of motions, including mapping and navigation applications, measuring utilities, interactive (between two machines) applications, and simple apps like casual games.
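Before moving on, it helps to see concretely why a gyro by itself can’t provide orientation: the only way to get orientation from a gyro alone is to integrate its angular-speed readings over time, and any small bias in those readings accumulates into drift. Below is a minimal sketch of that integration for one axis using the WinRT Gyrometer class (in the JavaScript projection the reading’s timestamp surfaces as a Date); the drift problem is deliberately left unsolved to illustrate the limitation.

var gyrometer = Windows.Devices.Sensors.Gyrometer.getDefault();
var yawDegrees = 0;
var lastTime = null;
if (gyrometer) {
    gyrometer.addEventListener("readingchanged", function (e) {
        var t = e.reading.timestamp.getTime();
        if (lastTime !== null) {
            var dt = (t - lastTime) / 1000;   // seconds since the previous sample
            // angularVelocityZ is in degrees per second; integrating it accumulates
            // both real rotation and sensor bias, so this estimate drifts over time.
            yawDegrees += e.reading.angularVelocityZ * dt;
        }
        lastTime = t;
    });
}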
[h=3]Engineering challenges[/h] We started our exploration into motion apps by prototyping some 3D experiences. The first challenge was to map the physical orientation of the device directly to a virtual 3D environment in the app. We decided to model a simple augmented reality experience by emulating a tablet as a window into a virtual world. The concept was fairly simple: when you move the device while looking at the screen, the virtual environment (the inside of a room) would appear to stay stationary.
Initially, we tried an experiment using the accelerometer to map up and down movement of the device to up and down movement of the 3D environment in response. When you hold the device still, the scene should remain stable. When you tilt the device, the view should tilt up or down. Right away we encountered an issue: “noise” in the data from the accelerometer sensor was causing jittery movement of the 3D environment even when the device was held stationary. We were able to see this noise clearly by capturing accelerometer data and charting it.

Without noise, the lines on the chart would be straight, with no vertical deviation. The conventional way to remove such noise is to apply a low-pass filter to the raw data stream. When we implemented this mitigation in our prototype, the resultant motion was smooth and stable (jitter-free). But the low-pass filter introduced another problem: the app lost responsiveness and felt sluggish when responding to motion. We needed a way to compensate for this jitter without reducing responsiveness.
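For reference, here is a minimal sketch of the kind of low-pass filter described above: a simple exponential smoothing of each accelerometer axis. The smoothing factor is an illustrative value; smaller values suppress more jitter but add more of the lag that made the prototype feel sluggish.

// Exponential low-pass filter: the output follows the input slowly, so
// high-frequency jitter is suppressed at the cost of responsiveness.
var alpha = 0.1;                        // illustrative smoothing factor, 0 < alpha <= 1
var filtered = { x: 0, y: 0, z: 0 };

function lowPass(reading) {
    filtered.x += alpha * (reading.accelerationX - filtered.x);
    filtered.y += alpha * (reading.accelerationY - filtered.y);
    filtered.z += alpha * (reading.accelerationZ - filtered.z);
    return filtered;
}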
The next experiment was to provide the ability to “look left” and “look right” in our virtual 3D environment app. We used a 6-axis compass solution (3D accelerometer + 3D magnetometer) to support this type of movement. Although this kind of worked, the movement was not consistent due to the general instability of the 6-axis compass. It was also challenging to blend the up-and-down movement with the left-and-right movement.
From these experiments it was clear that this combination of sensors could not provide the fluid and responsive experience we wanted. The accelerometer sensor was not providing clean data, and could not be used alone to determine device orientation. The magnetometer was slow to update and was susceptible to electromagnetic interference (think of a compass needle that sticks in one position occasionally). We had yet to experiment with the gyro sensors, but because gyros could only determine rotational speed, it wasn’t clear how they could help.
[h=3]Creating “sensor fusion”[/h] But further experimentation demonstrated that using all three sensors together could solve the problem. It turns out that an accelerometer, magnetometer, and gyro can complement each other’s weaknesses, effectively filling in gaps in data and data responsiveness. Using a combination of these sensors, it is possible to create a better, more responsive, and more fluid experience than the sensors can provide individually. Combining the input of multiple sensors to produce better overall results is a process we call sensor fusion.
Essentially, sensor fusion is a case where the whole is greater than the sum of the parts. A typical sensor fusion system uses a 3D accelerometer, a 3D magnetometer, and a 3D gyro to create a combined “9-axis sensor fusion” system. To understand how this system works, let’s take a look at the inputs and outputs.

9-axis sensor fusion system
This diagram shows two types of outputs: pass-through outputs in which the sensor data is passed directly to an application, and sensor fusion outputs in which the sensor data is synthesized into more powerful data types.
Some applications can use pass-through sensor data directly. This data can be used at “face value” for a variety of scenarios. One such scenario is an app that implements a pedometer to count your steps as you walk. The graph below shows the output of the accelerometer for a person walking with a tablet PC. This graph clearly shows it is possible to detect every step the person took.


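As a rough illustration of using pass-through data directly, the sketch below counts steps by thresholding the magnitude of the acceleration vector. The threshold and debounce interval are made-up values for this example; a real pedometer would also filter the signal and handle different walking styles.

var stepCount = 0;
var lastStepTime = 0;
var STEP_THRESHOLD_G = 1.2;   // illustrative: a step impact pushes magnitude above ~1 g
var MIN_STEP_MS = 300;        // illustrative debounce so one impact isn't counted twice

function onAccelerometerReading(e) {
    var r = e.reading;
    var magnitude = Math.sqrt(r.accelerationX * r.accelerationX +
                              r.accelerationY * r.accelerationY +
                              r.accelerationZ * r.accelerationZ);
    var now = Date.now();
    if (magnitude > STEP_THRESHOLD_G && (now - lastStepTime) > MIN_STEP_MS) {
        stepCount++;
        lastStepTime = now;
    }
}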
But, as our experiments revealed, many applications can’t effectively use the raw sensor data. Some of these applications include:

  • Compass apps
  • Enhanced navigation and augmented reality apps
  • Casual games
  • 3D gaming apps
Here’s a screenshot from a 3D game sample:

3D first-person shooter game (shown at //Build/)
These applications need to use sensor fusion data in order to support the features they implement. The “magic” of sensor fusion is to mathematically combine the data from all three sensors to produce more sophisticated outputs, including a tilt-compensated compass, an inclinometer (exposing yaw, pitch, and roll), and more advanced representations of device orientation. With this kind of data, more sophisticated apps can produce fast, fluid, and responsive reactions to natural motions.
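In the Windows Runtime, these fusion outputs are exposed as their own sensor classes, so a compass app reads Compass, and an app that needs yaw, pitch, and roll reads Inclinometer. A short sketch, with error handling omitted:

var compass = Windows.Devices.Sensors.Compass.getDefault();
var inclinometer = Windows.Devices.Sensors.Inclinometer.getDefault();

if (compass) {
    compass.addEventListener("readingchanged", function (e) {
        console.log("Heading: " + e.reading.headingMagneticNorth + " degrees");
    });
}

if (inclinometer) {
    inclinometer.addEventListener("readingchanged", function (e) {
        console.log("Yaw: " + e.reading.yawDegrees +
                    ", pitch: " + e.reading.pitchDegrees +
                    ", roll: " + e.reading.rollDegrees);
    });
}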
By integrating a sensor fusion solution, Windows 8 provides a complete solution for the full range of applications. Sensor fusion in Windows solves the problems of jittery movement and jerky transitions, reduces data integrity issues, and provides data that allows a seamless representation of full device motion in 3D space (without any awkward transitions).
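For readers curious about what “mathematically combine” can look like, one of the simplest fusion techniques is a complementary filter: trust the gyro over short time scales, where it is smooth and responsive, and let the accelerometer’s gravity reference correct the result over long time scales, where the gyro drifts. The sketch below shows the idea for a single axis; it is only an illustration of the general technique, not the algorithm Windows or any particular sensor vendor uses.

// Complementary filter for one rotation axis (pitch), illustration only.
// gyroRateDegPerSec: angular speed from the gyro for this axis
// accelPitchDeg:     pitch estimated from the accelerometer's gravity vector
// dtSeconds:         time since the previous sample
var pitchDeg = 0;
var GYRO_WEIGHT = 0.98;   // illustrative: mostly gyro, slowly corrected by gravity

function fusePitch(gyroRateDegPerSec, accelPitchDeg, dtSeconds) {
    pitchDeg = GYRO_WEIGHT * (pitchDeg + gyroRateDegPerSec * dtSeconds) +
               (1 - GYRO_WEIGHT) * accelPitchDeg;
    return pitchDeg;
}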
[h=3]Working with hardware partners[/h] While designing a sensor fusion solution for Windows, we also needed to help hardware designers to take advantage of this solution by partnering with them early. Designing a sensor fusion system is relatively easy if you’re designing a single device. But Windows runs on many kinds of PCs in many form factors, using hardware components from many different manufacturers. We needed to provide a solution that enabled the entire ecosystem of Windows hardware partners to participate.
The first step was to provide a baseline of performance for sensor packages that would work with Windows’ sensor fusion solution. Using Windows certification guidelines, we provided specifications for sensor performance. To help hardware companies verify that their solutions were compatible with Windows, we built a number of tests, which we provide with the Windows Certification kit.
Reducing the cost of developing and supporting drivers was another challenge. To make things simpler for sensor hardware manufacturers and PC makers, we wrote a single Microsoft-supplied driver that works with all Windows-compatible sensor packages connected over USB, and even over lower-power buses like I2C. This sensor class driver enables hardware companies to innovate with sensor hardware while ensuring that their hardware can be supported easily with drivers that ship with the Windows operating system.
To help speed adoption of the class driver, Microsoft worked with industry partners to introduce the specification into public standards. In July 2011 the standard for sensors was introduced in the HID (Human Interface Device) specification of the USB-IF (HID spec version 1.12, introduced with review request #39). This standardization enables any sensor company to build a sensor package that is compatible with Windows 8 by following the public standard USB-IF specifications for compliant device firmware. This reduces the time and cost required to integrate sensor hardware with Windows 8 PCs. Other benefits include a lower support cost and more consistent hardware capabilities for Windows 8 PCs that are equipped with sensors.
But beyond standardizing the class driver, we also wanted to optimize the performance of the sensor fusion solution, and minimize its impact on battery life. Each active sensor on a system draws power, and sending data up the stack consumes both memory and CPU time. We helped minimize the power and performance impact for sensor fusion systems running on Windows 8 in two major ways:
1. We architected the sensor fusion interfaces in Windows 8 to enable much of the processing of sensor fusion data to happen at the hardware level. This hardware-level sensor fusion capability means that computationally expensive algorithms don’t have to run on the main CPU, saving power and CPU cycles.
2. We implemented powerful filtering mechanisms that we tied directly to the needs of sensor apps running at any given point in time. This pay-for-play data and event model means that sensor data is only sent up the stack at the rate that apps need that data, and no faster. This results in greatly reduced CPU utilization for sensor data throughput.
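From an app’s point of view, this pay-for-play model surfaces as the reportInterval property on each WinRT sensor object: the app asks for the slowest rate it can live with, and data flows up the stack no faster than that. A sketch:

var accelerometer = Windows.Devices.Sensors.Accelerometer.getDefault();
if (accelerometer) {
    // Ask for a reading every 100 ms, but never faster than the hardware supports.
    accelerometer.reportInterval = Math.max(100, accelerometer.minimumReportInterval);

    // Setting reportInterval back to 0 releases the request and restores the default rate.
    // accelerometer.reportInterval = 0;
}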
[h=3]Sensors and Metro style apps[/h] To pull all of this together, our final challenge was to make the power and promise of sensor fusion available to those writing Metro style apps. To enable this, we designed a sensor API as part of the new WinRT. Through these APIs, developers can access the power of sensor fusion from any Metro style app. These APIs are clean and simple, and at the same time give developers access to the data needed to support everything from casual games to virtual reality applications. Of course these capabilities are all available as Win32 APIs for game developers or other uses in desktop applications.
The following JavaScript code snippet shows how easy it is to get access to an accelerometer and subscribe to events using the Windows Runtime:

// Get the default accelerometer and subscribe to its readingchanged event.
var accelerometer = Windows.Devices.Sensors.Accelerometer.getDefault();
accelerometer.addEventListener("readingchanged", onAccReadingChanged);

function onAccReadingChanged(e) {
    // Acceleration along each axis is reported in g's.
    var accelX = e.reading.accelerationX;
    var accelY = e.reading.accelerationY;
    var accelZ = e.reading.accelerationZ;
}
For more information about support for sensors in the Windows Runtime, please see this //build/ session on using location & sensors in your app.
You may be wondering at this point how you can try out sensor fusion on Windows 8, or even write some apps that use these new capabilities. Developers who attended the //build/ conference in 2011 received the Samsung Windows 8 Developer Preview slate PC, which included a full package of sensors. There were only about 4,000 of those given out, so of course, not everyone had the opportunity to get one. The good news is that the same 9-axis sensor fusion system that was built into the Windows Developer Preview device is now available online for purchase from ST Microelectronics. The “ST Microelectronics eMotion Development Board for Windows 8” (model # STEVAL-MKI119V1) attaches via USB, and works with the HID sensor class driver that’s included in Windows 8. If you’ve downloaded the Developer Preview version of Windows 8 and are itching to try out the sensor experience, you should consider getting one of these devices.
ST Microelectronics eMotion Development Board for Windows 8
Now let’s take a look at sensor fusion in action!
Download this video to view it in your favorite media player:
High quality MP4 | Lower quality MP4
-- Gavin

The sensor system described here doesn't mention the options developers have for placing coefficients and other parameters required for complex computation in the sensors themselves. These are sometimes necessary to increase the accuracy of the sensor outputs.
These parameters can be written to the device, if the host allows it, when the OEM integrates the sensor. Would that still make the device compliant with Windows Hardware Certification?
 

Allow me to add that it is expensive to build a device with on-chip flash to hold these values, which are different for each phone/tablet, etc.
 

. . .just promise me that there will be no "Test" on this subject matter on Friday. . .:thumbsup:
 

. . .just promise me that there will be no "Test" on this subject matter on Friday. . .:thumbsup:

Don't worry, no test on Friday because I renamed that day 'SecondThursday' :rolleyes:
 

If they would get rid of the high-gloss screens and switch to matte, you wouldn't need a brightness control that much. Most of these pads and tablets are practically unreadable during daylight hours.
 

Adaptive brightness? Adaptive brightness was brand new in the G5 iMacs, right before the ill-fated move to Intel cores. I actually have an A1058; it's got adaptive brightness in it.
 

More than 50% of all US cell phones are iPhones which is persistently expanding its prior victory.

Naturally. If it works well on an iPhone, they want to duplicate that on other devices. I never thought I would have one, much less use it and like it as much as I do. I'm on my third one: I had an original 8 GB iPhone, then an iPhone 4, and now a 4S. I chose to stay with the 4S because I've spent significant money on accessories that I can't replace at this time.

The factor that has made this quoted statement a fact is one simple thing: universal usefulness. Any other device maker is hard pressed to come up with something that can do what is available by default with an iPhone. Samsung's S IIIs are close, but I can crush one of those with my hand; I cannot crush an iPhone, I would have to take a sledgehammer to it.

I'd like to see Windows 8 run on either an iPhone or an S III, though. There are themes available to jailbreak users that make the iOS interface look a lot like Windows 8; couple that with the usefulness of many iOS apps, and it is ideal.

This is where Windows 8 fails so far: usefulness. If I had Windows 8 right now on my iPhone 4S, there are simply no apps to replace what I use. I will add a "yet" to that. If Microsoft can lift themselves up by their bootstraps to bring cross-platform usefulness to devices and tablets running Windows 8, that would be the kick in the pants they need.
 
