
PBXes were originally developed as telephony solutions for small to large businesses. A key goal of the PBX was to share a large pool of internal company phone numbers among a much smaller number of outside lines. Calls placed within the company’s PBX place no demand on telephone company resources outside the company; only calls external to the company are routed onto phone company lines.


Early VoIP solutions sometimes had audio quality problems, much of it due to channel congestion and similar issues. Congestion occurred when too much traffic was carried on one of the links to a subscriber, or when Ethernet collision retries repeatedly delayed an audio packet in transit between subscribers. The net effect included audio gaps and bizarre sounds produced by the receiving CODEC. Today, however, VoIP is a solid technology that can be implemented in physical hardware or software.




Lanner Inc. (1) manufactures a series of network appliances, including the FW-8910, which may be configured with a variety of software options including VoIP. The FW-8910 supports Windows (2000, 2003, XP) and Linux operating systems, which in turn dictate the software that may run on the appliance. Other appliances from Lanner support additional operating systems, including FreeBSD and NetBSD. Lanner’s appliances are intended to offer a platform for a variety of software packages; by providing a platform based on a network appliance model, Lanner offers an alternative to distributed hardware in small to medium sized enterprise systems.




The VoIP PBX software load permits the appliance to function as a full-featured telephone system. Additional software can be configured to add security, VPN, and traffic monitoring. The FW-8910 is built around the Intel® 3420 chipset and includes two Intel Xeon C5500 series processors. The Intel 3420 chipset supports eight PCI Express x1 ports configurable as x2 and x4, six SATA 3 Gb/s ports, twelve USB 2.0 ports for I/O, and Intel Matrix Storage Technology. The Xeon C5500 is a four-core, eight-thread processor running at 2.13 GHz. While this seems like a large amount of processor power, it’s important to recognize that these systems are designed to provision many hundreds, up to thousands, of simultaneous calls. In addition, users and administrators of VoIP systems may choose the number of bits per second that each phone call requires for intelligible conversation; generally, VoIP provides satisfactory fidelity at 128 kbits per second. Where Lanner offers a packaged product in an enclosure, Kontron (2) offers a board-level Advanced Mezzanine Card module as a platform for embedded software such as VoIP. Like Lanner’s, Kontron’s product is based on the quad-core Intel Xeon™ processor with the Intel 3420 server-class chipset. The combined chipset includes Hyper-Threading support and Intel Turbo Boost technology.
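To see why a platform like this can provision hundreds or thousands of calls, a quick back-of-the-envelope calculation helps. This Python sketch uses the 128 kbps per-call figure cited above; the 10% overhead allowance for IP/RTP framing and signaling is an assumption for illustration, not a vendor number:

```python
def calls_supported(link_bps, per_call_bps=128_000, overhead=0.10):
    """Estimate how many simultaneous calls a link can carry.

    per_call_bps: codec payload rate (the 128 kbps figure above);
    overhead: fraction reserved for IP/RTP framing and signaling.
    """
    usable = link_bps * (1 - overhead)
    return int(usable // per_call_bps)

# A single gigabit port, minus 10% overhead, carries ~7000 such calls.
print(calls_supported(1_000_000_000))  # → 7031
```

The bottleneck for large call counts is therefore rarely raw link bandwidth; it is the per-call processing (codec transcoding, call routing, conferencing) that the multi-core Xeon C5500 is sized for.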


Trenton Technology, Inc. (3) offers a third form factor: a PICMG 1.3 system host board, the JXT6966, also based on two multi-core Intel Xeon C5500 series processors. Like the other two manufacturers’ products, the JXT6966 incorporates the Intel 3420 Platform Controller Hub. These three hardware platforms all use the Intel Xeon C5500 processor and its companion hub controller. The diversity of form factors built around the C5500 gives systems designers the flexibility to choose how functions will be delivered to users. With such flexibility on the hardware front, the question becomes what kind of VoIP software can be enabled on these platforms.


Open Source software can provide unexpected solutions, and Asterisk from Digium is one of them. Developed a little more than a decade ago, Asterisk brought many alternatives to computerized telephony, one of which is a full-featured VoIP PBX system. Asterisk is an Open Source phenomenon: in 2008 over 1.5 million downloads of Asterisk were made by all types of users, from individuals looking to create their own phone system with features like voicemail and remote control, to large corporations seeking to provide enterprise-wide telephony. Asterisk was written assuming a Linux environment, but Microsoft (4) Windows is now supported as well. For Xeon C5500-based hardware platforms that support Linux, targeting Asterisk is an automated task using AsteriskNOW, which automates the full installation process for a number of applications including a VoIP PBX. Asterisk includes many other capabilities: VoIP gateway, Skype gateway, IP PBX, call center Automatic Call Distributor, conference bridge, IVR server, voicemail system, call recorder, fax server, and speech server. Asterisk also exploits multi-core hardware platforms through the extensive use of threads to structure the code. Threading is key to the effective use of multi-core, multi-CPU, and Beowulf distributed hardware. Asterisk developers have adopted a rigorous standard employing an Asterisk-specific thread library; this technique enhances portability and permits Asterisk to compile for a variety of hardware platforms, including the Xeon C5500-based products described above.
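Asterisk’s thread wrappers are written in C, but the per-call worker pattern they enable can be sketched in a few lines of Python. This is a hypothetical illustration of the pattern only, not Asterisk’s actual code; the extension numbers are made up:

```python
import queue
import threading

# Sketch: incoming calls are queued and handed to a pool of worker
# threads, the structure that lets a PBX spread work across CPU cores.
call_queue = queue.Queue()
routed = []
routed_lock = threading.Lock()

def call_worker():
    while True:
        extension = call_queue.get()
        if extension is None:        # sentinel tells the worker to exit
            call_queue.task_done()
            break
        with routed_lock:            # protect the shared result list
            routed.append(f"routed call to {extension}")
        call_queue.task_done()

workers = [threading.Thread(target=call_worker) for _ in range(4)]
for w in workers:
    w.start()
for extension in ("101", "102", "103"):
    call_queue.put(extension)
for _ in workers:                    # one sentinel per worker
    call_queue.put(None)
for w in workers:
    w.join()
print(len(routed), "calls routed")
```

In C, the same structure maps onto native threads that the OS schedules across all available cores, which is why a threaded design like Asterisk’s scales with the core count of the underlying Xeon hardware.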


Software without tools to target a specific platform doesn’t do much for anyone. Developers have options in their choice of development tools: Intel offers an Eclipse-enabled tool chain, as does Wind River Systems (5), and Green Hills Software (6) also offers a full, Eclipse-compatible tool chain. Development tools were the subject of another blog, as was porting software to a new environment.


The Open Source Asterisk software offers a wide array of options and solutions for telephony products. If you were developing a product using Asterisk, what would you do to differentiate your product from the out-of-the-box Asterisk software?



(1) Lanner Electronics Inc is an Associate Member of the Intel Embedded Alliance

(2) Kontron is a Premier Member of the Intel Embedded Alliance

(3) Trenton Technology, Inc. is an Affiliate Member of the Intel Embedded Alliance

(4) Microsoft  is an Associate Member of the Intel Embedded Alliance

(5) Wind River Systems is an Associate Member of the Intel Embedded Alliance

(6) Green Hills Software is an Affiliate Member of the Intel Embedded Alliance



Henry Davis
Roving Reporter (Intel Contractor)
Intel® Embedded Alliance

The telecom industry is facing growing bandwidth demand due to 3G and 4G wireless as well as video traffic.  To meet this demand, the industry is preparing to move from today’s 10G systems to higher-bandwidth 40G platforms.  Although the 40G AdvancedTCA (ATCA) standards are not yet finalized, designers can—and should—start designing for 40G today.  The shift from 10G to 40G ATCA is expected to begin this year, with full-fledged deployments starting in 2011.  Given that ATCA rollouts typically require a year or more, network equipment designers must start 40G designs now if they are to meet these timelines.


We will show how to get started on 40G in a moment, but let’s first review 10G ATCA technology. Today’s 10G ATCA backplanes are based on the PICMG* 3.1 Option 9 standard. As shown in Figure 1, the PICMG 3.1 Option 9 backplane implements a 10GBase-KX4 link, which is composed of four serial lanes running at 2.5 Gbps each.



Figure 1. The 10GBase-KX4 backplane link is composed of four serial lanes running at 2.5 Gbps each.


Moving forward, ATCA backplanes will be upgraded to meet the 10GBase-KR standard, which achieves a throughput of 10 Gbps per lane. With the upgrade to 10GBase-KR, the backplane will offer four 10 Gbps lanes. The final step in the transition to 40G is to tie the four lanes together to create a single 40 Gbps 40GBase-KR4 link. These steps will take some time. The 10GBase-KR and the 40GBase-KR4 technologies are expected to be ratified as part of the PICMG 3.1 R2 ATCA specifications in 2010. 10GBase-KR technology is expected to come to market in 2010, while 40GBase-KR4 systems are expected to appear in 2011.
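The lane arithmetic behind this transition is simple enough to check directly; a short Python sketch confirms the link rates quoted above:

```python
# PICMG 3.1 backplane link rate = number of lanes x per-lane rate (Gbps)
def link_rate_gbps(lanes, gbps_per_lane):
    return lanes * gbps_per_lane

print(link_rate_gbps(4, 2.5))   # 10GBase-KX4: four 2.5 Gbps lanes
print(link_rate_gbps(4, 10))    # 40GBase-KR4: four 10 Gbps lanes
```

The key observation is that the lane count never changes; the entire 4x bandwidth jump comes from running each serial lane four times faster.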


As should be clear by now, 40G backplanes will largely resemble today’s 10G backplanes.  The primary difference is that the serial lanes will run at 10 Gbps instead of 2.5 Gbps.  The other key difference between 10G and 40G is power.  40G cards will need increased processing power, and that means more heat.  Thus, 40G shelves must be rated for higher heat dissipation.  Among other things, this means leaving more space between cards for better airflow and larger heat sinks.


While these are not trivial issues, the fact remains that 10G and 40G platforms have much in common.  In fact, some of today’s 10G platforms have been verified as “40G-ready”—that is, the platforms have been designed and tested for 40 Gbps traffic.  Examples of these platforms include the Emerson Centellis* 4440 and the RadiSys ATCA 4.0 Platform, which is based on the Promentum* SYS-6016.  Both of these platforms feature backplanes that support 10GBase-KR and 40GBase-KR4.  These platforms have also been designed with the increased cooling needs of 40G in mind. For example, the Centellis 4440 provides cooling that conforms to the CP-TA B.4 thermal profile, the highest level currently defined. This cooling ability dovetails with the shelf’s power distribution to allow blade power levels up to 350 Watts per board.


Of course, the shelf is not the only thing you need to get to 40G—you also need 40G switches and payload cards.  These cards are not available today, so you will need to start development with 10G components.  Once 40G switches are released—which is expected to happen later this year—you can replace your 10G switches.  Then when 40G payload cards are released in 2011, you can upgrade those cards as well.  Figure 2 illustrates this upgrade process.  Note that the chassis, cabling, and system installation remain intact throughout this process, allowing much of the integration and verification efforts to be re-used. The result is a reduction in the cost, complexity, and risk (as compared to starting a 40G design from scratch).

Figure 2. The 40G upgrade path, using the RadiSys platform as an example.


To follow this path, you must first select a 40G platform that is compatible with existing 10G cards.  You’ll also want to look for platforms with pre-validated 40G and 10G components to avoid interoperability issues.  This compatibility is obviously essential for near-term deployments using existing 10G cards.  It will also serve you well over the long run, as many deployments have elements with lower bandwidth requirements.  Using 10G cards for these less-demanding elements is a great way to keep costs down.


It is also a good idea to look for vendors with scalable solutions.  For example, Emerson offers the 2-slot Centellis* 2000 in addition to its 14-slot Centellis 4440.  (See Figure 3.)  Having compatible platforms in different sizes makes it easier to meet the needs of different customers and different applications.  Many elements of an ATCA platform core are essentially unaffected by the platform’s size.  Thus, it is possible to re-use development efforts when moving between solutions of different sizes when using compatible platform bases.



Figure 3.  Emerson offers two 40G-ready platforms: the 14-slot Centellis 4440 (left) and 2-slot Centellis 2000 (right).


Finally, you should consider the partner ecosystem when selecting a 40G solution.  A comprehensive, tightly-integrated ecosystem can shave months off your development time by providing system management software, control and data plane software, and more.


The bottom line is that you can get a head start on 40G by making wise use of today’s 10G platforms.  For more reading on this topic, I highly recommend Emerson’s white paper Get Ready for 40G ATCA (registration required) and the RadiSys paper A Smooth Transition to 40G.  I also suggest Emerson’s recent blog 10G Deployments, 40G Available and new applications for ATCA.


So… are you ready to start designing for 40G?  If you’ve already started, what challenges have you encountered?  What solutions can you share with the community?  Let us know by providing feedback in the comments section below.


Emerson Network Power and RadiSys are Premier members of the Intel® Embedded Alliance.



Kenton Williston

Roving Reporter (Intel Contractor)

Intel® Embedded Alliance


Embedded Innovator magazine

Personalization has always been in vogue as a marketing ploy to promote everything from monogrammed clothes and jewelry to customized laptops and cell phones with special ring tones. Today, this trend is appearing in the health care industry as a strategy to encourage proactive health management. With Wii Fit and other game consoles that administer fitness training, iPhone apps that track caloric intake, and even PDA-like devices that rate products using genome computing analysis, users have a variety of options for personalizing their health care.


Providing an interconnected personal telehealth system wherein physicians can remotely prescribe treatments and patients can monitor their health status requires accurate, reliable information communicated via interoperable devices. This emerging priority represents the objective of the Continua Health Alliance, a nonprofit coalition of health care and technology companies collaborating to improve the quality of personal health care.


“Interoperability is crucial to spur the widespread adoption of connected personal health solutions, and thus it is at the core of Continua Health Alliance’s mission to build a framework for developing, testing, and implementing interoperable, connected health devices,” stated Rick Cnossen, Continua’s president and chair of the board of directors, during a recent Q&A.


Last year Continua released its Version One Design Guidelines for device manufacturers, test labs, and other companies involved in the Continua Certification process, and at this year’s Consumer Electronics Show (CES) demonstrated the first end-to-end connected health solution based on the Continua architecture. In the demonstration, data from a Continua Certified Bluetooth-enabled Nonin Medical wireless pulse oximeter (shown here) was sent to a PC manager running the Vignet Connected Health Services platform using the Continua device interface standard, then uploaded to an IBM server using the Continua WAN interface standard.




The group is currently developing the next generation of guidelines, which will include two low-power radio standards: Bluetooth low-energy wireless technology and ZigBee Health Care technology.


Intel was one of the founding companies of Continua and is active in the organization today, providing resources to working groups and collaborating with other member companies to establish an ecosystem of interoperable personal health systems, says Cnossen, who in addition to his role at Continua serves as director of Personal Health Enabling in Intel’s Digital Health Group.


Eurotech, an Associate member of Intel Embedded Alliance, also participates in the organization, working with Intel and other companies to provide products that support connectivity standards such as Bluetooth and USB, says Eurotech project manager Haritha Treadway.


“Being active in the Continua Health Alliance helps Eurotech be a contributing partner in meeting the interoperability requirements of nascent remote patient monitoring and other medical applications,” Treadway says.


When developing products for medical deployments, Eurotech uses standardized form factors such as COM Express, EPIC, and PC/104-Plus with support for high-speed communication interfaces like PCI Express and Gigabit Ethernet, Treadway says.


“Plug-and-play is a key aspect to solving the challenges of medical device interoperability with the need to support a range of standards and I/O options,” she says.


Offering an assortment of I/O and connectivity choices, Eurotech’s Helios edge controller platform with the Intel Atom Z5xx processor has been used to gather and transmit information from medical devices in a patient’s home. Through collaboration with Intel’s design and simulation teams, Eurotech develops its platforms using processes to ensure robust design with reliable signal integrity, EMI, and thermal performance, helping OEMs achieve FDA certification, Treadway asserts. The Intel Architecture aids in this process by providing solutions such as Intel Hyper-Threading Technology for increasing performance without a corresponding increase in power consumption and Intel Virtualization Technology (Intel VT) for enabling multiple OSs and peripheral code to run in parallel, she adds.


Medical device manufacturers can incorporate multiple systems with Windows, Linux, or other OSs securely segregated from real-time systems using the LynxSecure real-time hypervisor and separation kernel from LynuxWorks, an Affiliate member of Intel Embedded Alliance.


“The use of today’s modern Intel processors with LynxSecure, which is optimized for Intel virtualization, provides our customers with unparalleled performance for multicore, multi-OS, multi-application medical systems,” says LynuxWorks director of business development George Brooks.


LynuxWorks’ family of OSs is built for safety-critical reliability and based on open-standard interfaces, with certified POSIX conformance allowing UNIX or Linux applications to run without change, Brooks says. These OSs have been used in high-end imaging, life support, and bedside patient monitoring applications, including a recent proof-of-concept platform that connected more than 25 wireless biometric sensors using a Portwell Mini-ITX board with Intel VT and LynxSecure to isolate the Bluetooth networking stack from other system software.


In addition to Mini-ITX boards, Intel Embedded Alliance Associate member Portwell offers other Intel-based products for medical applications, such as the NANO-8050 with support for Intel Active Management Technology (Intel AMT), enabling customers to remotely monitor units in the field, says Jack Lam, American Portwell Technology’s senior product marketing manager. Because these units are often used to diagnose and treat patients, system downtime not only represents lost revenue, but also risk to patients. Intel AMT provides easy troubleshooting over a network, thus shortening the time it takes to recover the system, he says.


These medical systems must be built with long life support and the ability to work with other equipment in the field, Lam says.


“Adapting new technology is just half of the equation,” he remarks. “Keeping it simple and designing it with common form factors and connectors to work with legacy units is the other half of the solution.”


The Continua Health Alliance is striving to make this process easier for companies by reducing the complexity of standards and providing events, training, and a rigorous certification program, Cnossen says.


“Companies are transitioning their products to target the broad, international, standards-based market and leveraging the tools and resources Continua has to offer to minimize effort and ensure high quality,” he says.


Better, cheaper, easier – these are the words you want to describe the equipment used to manage your health. Any other ideas for ways to improve medical device reliability and interoperability?


Jennifer Hesse

OpenSystems Media®, by special arrangement with Intel® Embedded Alliance

The digital signage market is constantly pushing for solutions with lower cost. Today, cost-reduction efforts focus heavily on up-front hardware costs. While up-front costs are certainly important, they represent only a portion of a system’s total cost of ownership (TCO). To minimize TCO, signage developers and end customers alike must consider a system’s reliability, longevity, and manageability. In this blog we’ll explain how these factors impact cost and show how solutions based on the Intel® Core™ 2 Duo help keep costs down. We will also preview upcoming systems based on the new Intel® Core™ i7, Intel® Core™ i5, and Intel® Core™ i3 processors.


Let’s start by looking at reliability and longevity.  Today designers often control costs by using consumer-grade PCs to power their signage.   Although this approach can minimize up-front costs, it can lead to large expenses down the road.  Consumer-grade solutions are not designed for 24x7x365 operation, and fan failures and board failures are common.  Consumer-grade solutions are particularly vulnerable in harsh environments with heat or vibration issues.


Replacing failed components is expensive.   In addition to the hardware replacement costs, component failures lead to expensive field service.  Hardware failures can also lead to complications on the software side.  Consumer-grade solutions become obsolete at a frightening pace, so when a component fails it may not be possible to find an exact replacement.  Changing the components in a system means changing the drivers—and there is no guarantee that the new drivers will work smoothly.   Hardware changes are particularly risky if the system depends on specific hardware features, such as a specific HD video accelerator.


Hardware changes also mean that it is no longer possible to maintain a single hardware build across locations. Instead, locations with the original hardware will use one build, while locations with the new hardware will use a different build. Maintaining multiple builds is a headache, and it adds to maintenance costs.


It’s also important to remember that digital signage drives revenue through ad sales, increased impulse purchases, etc.  An out-of-service sign cannot drive these transactions, so hardware failures impact revenues as well as costs.


One way to avoid these problems is to use embedded signage solutions based on the Intel® Core™ 2 Duo. These embedded solutions are designed for 24x7x365 operation, and are far more robust than consumer-oriented PCs.   For example, embedded solutions are typically fanless, eliminating a key source of hardware failures.  (Example fanless solutions include the AAEON AEC-6860 and the Nexcom NDiS 161 shown in Figure 1.)  If the hardware does fail, direct replacement is easy—embedded solutions are commonly available for 3 years or more.

Figure 1.  The AAEON AEC-6860 is a fanless digital signage solution based on the Intel Core 2 Duo processor. 


Embedded signage solutions are also able to go into environments that consumer-grade PCs cannot handle. Many embedded signage solutions are rated for extended temperature ranges and high humidity levels. These rugged solutions allow digital signage to be installed outdoors and in other locations that lack air conditioning. For example, the Advantech ARK-1382-S2 is rated for operation at temperatures of 0-50 °C and humidity levels up to 95% (Figure 2).


Figure 2. The Advantech ARK-1382 is rated for extreme temperature and humidity. 


Some embedded signage solutions are also designed to withstand shock and vibration. These solutions allow digital signage to operate reliably inside trains, buses, boats, and other vehicles, as well as in vibration-prone locations like train platforms. The Nexcom PDSV 6110 (Figure 3) is typical of these vibration- and shock-resistant solutions; it is designed specifically for in-vehicle applications.


Figure 3. The Nexcom PDSV 6110 is designed specifically for in-vehicle applications. 


Now let’s turn our attention to manageability.  Embedded solutions that incorporate Intel® vPro™ technology—and in particular Intel® Active Management Technology (Intel® AMT)—can significantly reduce maintenance and management costs.  Intel AMT provides an out of band (OOB) communications channel.  This channel lets system administrators remotely diagnose, repair, and manage the signage—even when the signage is powered down.


Remote management provides a number of cost-saving benefits.    First, the diagnosis capabilities allow the system administrator to ensure that the signage is operating properly.   For example, Intel AMT can alert the system administrator if any of the required software is not running.  This capability ensures that the customer does not lose revenue from unnoticed malfunctions.


When problems do occur, Intel AMT gives the system administrator powerful recovery tools. The system administrator can correct problems at the BIOS, OS, or application level, and can remotely reboot the system to a known-good state. By addressing these issues remotely, Intel AMT reduces the need for expensive field service. Intel AMT also reduces the need for field service by enabling remote updates of application software, the OS, and BIOS firmware. This feature also helps the sys admin maintain tight version control, ensuring consistent operation across installations.


Intel AMT also makes it possible to turn a signage installation on and off remotely.  This leads to considerable power savings—and thus cost savings—by ensuring that the displays are turned off after trading hours.
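As a rough illustration of those savings, the per-display arithmetic looks like this. The 300 W draw and 12 off-hours per day below are hypothetical figures chosen for the example, not vendor data:

```python
def annual_savings_kwh(display_watts, off_hours_per_day, days=365):
    """kWh saved per display per year by powering down outside trading hours."""
    return display_watts * off_hours_per_day * days / 1000

# Hypothetical: a 300 W sign powered down 12 hours per day
kwh = annual_savings_kwh(300, 12)
print(kwh, "kWh saved per display per year")  # 1314.0
```

Multiplied across a fleet of hundreds of signs, scheduled power-down becomes a meaningful line item in the TCO calculation.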


Remote management has a secondary benefit: By minimizing time spent traveling to deal with local issues, Intel AMT helps technical staff use their time more efficiently.  This in turn reduces the number of technical staff needed to maintain the signage.  For more information on Intel AMT and its benefits, I highly recommend the article Cut Power and Improve Manageability in Point-of-Sale Terminals and Kiosks.


To take advantage of Intel AMT, you must have Intel AMT-enabled hardware and software.  Intel AMT is supported in Intel Core 2 Duo signage solutions from vendors including Advantech and AOpen, while Intel AMT-enabled signage software is available from vendors like Ryarc.


Intel AMT will also be available in signage solutions based on the Intel Core i7, Intel Core i5, and Intel Core i3 processors. The first of these solutions will be available this quarter from vendors like Advantech. This hardware will offer significant improvements in graphics and CPU performance while simultaneously reducing power consumption. The performance improvements include dual-stream HD video decoding (compared to single-stream decoding in the Intel Core 2 Duo chipset) as well as significantly better 3D graphics performance. For more details, see my earlier blog on Intel Core graphics performance.


While we are waiting to see what’s next, I want to hear from you. What strategies have you used to minimize costs? Have you taken advantage of Intel AMT? If so, how has it benefitted you?


Advantech is a Premier member of the Intel® Embedded Alliance.  AAEON and Nexcom are Associate members of the Alliance.



Kenton Williston

Roving Reporter (Intel Contractor)

Intel® Embedded Alliance


Embedded Innovator magazine

The complete Digital Surveillance System relies on several key technologies:


  1. Image capture using IP-enabled video cameras
  2. Image encode, often provided by newer generations of IP enabled cameras to minimize bandwidth requirements
  3. Image decode for playback
  4. Data storage and database analysis
  5. Image and scene analysis
  6. Threat/anomaly reporting


The first four requirements provide an updated technical equivalent of an analog Video Cassette Recorder. Technological advances provide smaller, lower cost cameras combined with vastly greater record times. What makes digital surveillance essentially different from previous recorders is the ability to perform unattended monitoring that alerts operators when defined events occur.


The long term goal of realtime computer monitoring of video feeds holds the potential to shift the balance towards gathering critical information in realtime with minimal human monitoring. Some types of video surveillance can be automated in realtime today using products such as huperLab’s huperVision 4000, which operates under Linux. The realtime system is used in retail stores to monitor for theft and shoplifting, identify consumer purchasing behavior, and watch employees for fraud and theft. Other uses of the system include building and security checkpoint monitoring, facilities monitoring against vandalism, and student monitoring.
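Unattended event detection of this kind often starts with simple frame differencing: flag a frame for operator review when enough pixels change between captures. The following Python sketch is a generic illustration of that idea, not huperVision’s algorithm; the thresholds are arbitrary:

```python
def frame_changed(prev, curr, pixel_delta=30, changed_fraction=0.02):
    """Flag a frame for operator review if enough pixels moved.

    prev, curr: equal-length sequences of 0-255 grayscale pixel values.
    pixel_delta: per-pixel change needed to count as "moved".
    changed_fraction: fraction of moved pixels that triggers an alert.
    """
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > pixel_delta)
    return changed / len(prev) >= changed_fraction

quiet = [10] * 100                     # static scene
intruder = [10] * 95 + [200] * 5       # 5% of pixels changed
print(frame_changed(quiet, quiet))     # → False
print(frame_changed(quiet, intruder))  # → True
```

Production systems layer background modeling, object tracking, and scene analysis on top of this basic test, which is where the processing demand that drives the hardware choices below comes from.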




The most visible portion of the huperLab solution is the video camera. Cameras are preferentially managed through an IP connection such as Ethernet.  Until recently, cameras for security systems were closed circuit analog devices that were hardwired or wirelessly connected through a proprietary protocol. Now digital cameras based on an Internet connection have exploded in popularity. A generic Network IP Camera is a stand-alone device which permits viewing live full motion video from anywhere over the Internet, using a standard web-browser. In the case of surveillance systems, connection to the camera is made via IP to receive the video data. Data analysis for surveillance is implemented in a dedicated surveillance-specific networked embedded system that is accessed via a networked connection.
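As a minimal sketch of how a surveillance system might pull a frame from such a network IP camera over plain HTTP, consider the following Python fragment. The `/snapshot.jpg` path and the camera address are hypothetical; real cameras document their own still-image and streaming endpoints:

```python
from urllib.request import urlopen

def snapshot_url(host, port=80, path="/snapshot.jpg"):
    """Build the HTTP URL for a camera's still-image endpoint.

    The path is a hypothetical example; consult the camera's own docs.
    """
    return f"http://{host}:{port}{path}"

def fetch_frame(host):
    """Pull one JPEG frame from the camera over plain HTTP."""
    with urlopen(snapshot_url(host)) as resp:
        return resp.read()

if __name__ == "__main__":
    # 192.0.2.10 is a documentation-range address, not a real camera.
    frame = fetch_frame("192.0.2.10")
    print(len(frame), "bytes received")
```

Real deployments typically use an RTSP or MJPEG stream rather than polling stills, but the principle is the same: the camera is just another IP endpoint the analysis system reads from.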




High data rates, or at least high demand for bandwidth, require well-crafted IP support. There are a number of methods to achieve minimal-overhead processing for IP communications. Real-Time Operating System (RTOS) vendors often include TCP/IP support optimized for real-time operation. Green Hills Software (1) not only provides a TCP/IP stack, it also offers a conversion service to move code developed for another vendor’s RTOS, including pSOS, Nucleus, POSIX, or Wind River Systems’ (2) VxWorks, to one supported by Green Hills.


TCP/IP stack optimizations in software are a fundamental improvement for network cameras, but purpose-built surveillance hardware usually relies on dedicated hardware for managing TCP/IP connections. For example, Lanner uses up to five Ethernet cards based on the Intel 82574L Gigabit Ethernet controller; altogether the card stacks can support 40 simultaneous high-speed connections.
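A quick budget check shows why five gigabit controllers comfortably serve 40 connections. This sketch splits the aggregate capacity evenly across streams and ignores protocol overhead, so treat it as a rough upper bound:

```python
def per_stream_mbps(nic_count, gbps_per_nic, streams):
    """Bandwidth headroom per camera stream across a pool of Ethernet ports."""
    return nic_count * gbps_per_nic * 1000.0 / streams

# Five gigabit NICs shared by 40 simultaneous camera connections
print(per_stream_mbps(5, 1, 40), "Mbps per stream")  # 125.0
```

Even a high-quality compressed HD camera stream uses a small fraction of that 125 Mbps budget, leaving ample margin for bursts and management traffic.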


huperLab’s huperVision operates on a variety of embedded cards, including the Gaia 404/408/416 board with the Intel Atom processor. Available cards support PCI and PCI Express card formats. huperLab also employs high quality video/audio capture chips to improve throughput performance. The Gaia board is a small, low-power server board designed for security surveillance use; a Gaia-based surveillance system reduces electricity expense, consuming less than 60 watts of system power.


huperLab is one solution to video surveillance. Intel’s Technology Journal offers a PDF document that includes surveillance as one computer vision application. Titled “Computer Vision Workload Analysis,” the article provides basic computer vision technical information that remains as relevant as when it was first published a few years ago. The document introduces a variety of topics that bear on video surveillance, but doesn’t include all of the features present in current digital surveillance products. Open Source software offers an alternative for portions of video surveillance software. Sun (3) provides Open Source video surveillance software focusing on the storage aspects of the system. Other freely available software packages covering portions of video surveillance functionality include:


SecureCam for Windows

Electric Eye for Linux

iDVR Video Surveillance System

OpenSurveillance web-based software running within a browser

Carnegie Mellon University maintains a web site dedicated to computer vision topics. The compendium of sources includes the CMU source library.

Reading People Tracker

Halcon library for image analysis

Intel Open Image Processing library



Combining parts of several of these open source packages with your own code may provide a more complete digital surveillance system, but combining disparate software sources is a difficult task. These software sources are probably best as an aid in gaining experience with the technology. You may find that one of these packages meets your objectives. But, a caution when dealing with these open and free sources: video surveillance technology is changing rapidly and these sources may not be current or complete.

Commercial video surveillance runs on ATOM, Core 2 Duo, and Core 2 Quad embedded processors. More and faster processors don’t necessarily mean faster application execution. Video processing is one application that benefits from Intel QuickPath Technology, which provides additional point-to-point paths between processors, memory, and I/O. Each core can access both shared memory and local memory, and the integrated memory controller offers better and faster interconnect between processing and I/O blocks.




Computer ethicists have long expressed concerns and misgivings over the intrusion into personal and private actions. But the uses discussed for automated video surveillance have been in settings previously decided by courts around the world to be permissible venues for video surveillance. Some professionals believe that computerized video surveillance improves privacy by replacing continuous monitoring by people with an algorithm operating on an embedded system. In effect, automated video surveillance simply alerts human operators to events that meet criteria requiring their attention.


Where can you use video processing techniques in your next projects?




(1) Wind River Systems is an Associate Member of the Intel® Embedded Alliance

(2) Green Hills Software is an Affiliate Member of the Intel® Embedded Alliance

(3) Sun is an Associate Member of the Intel® Embedded Alliance


Henry Davis
Roving Reporter (Intel Contractor)
Intel(r) Embedded Alliance

Many of us are familiar with those interactive displays at museums where you press a button and different bulbs light up to show you something like the path an explorer took to discover a new country or an arboreal animal's habitat in the forest canopy. Frankly, I'm never that impressed with those exhibits. I need a lot more "wow" to get me interacting with a display. Which is precisely why I got excited the other day while talking to the folks over at Micro Industries, an Affiliate Member of the Intel® Embedded Alliance. CEO Michael Curran was telling me about the interactive experience at the DuPont Environment Center in Wilmington, Delaware. Interpret Green, an interactive exhibition firm, chose one of Micro Industries' Touch&Go messenger* 65L dynamic media displays positioned flat like a table to give visitors an interactive aerial view of their grounds. Touch an active spot on the display and information appears.


It was one of those "aha" moments. My mind started cranking on how something like that could be used, and in most cases, Interpret Green had already thought about it. It's really a simple concept. You take a large touchscreen digital sign/computer and lay it flat on a stand. Suddenly, the world changes. People can gather around it and start touching places on the screen to make things happen. In a natural history museum, you might touch a pyramid and get a video tour, just in that area of the screen, of the pyramid's interior. Someone else on another end of the "table" could simultaneously touch an image of a mummy and see a video of the mummification process.


In a mall or airport, an interactive map could do much more than the standard "you are here" static map. In addition to showing where stores are in the mall or various gates in an airport, it could pop up information on specials or videos of the interior of the store that give you an idea of its merchandise. For mall or airport restaurants, photos of sample dishes or PDFs of menus could be displayed.


Other uses quickly come to mind as well. For a theme park, pictures or videos of each ride could pop up when you touch its location on a map. In an arcade, a horizontal touchscreen could be programmed as a game board for multi-player games.


This is big thinking and it gets bigger. Michael told me Micro Industries has introduced even larger Touch&Go messenger* units, the 82L and 82P (the "L" stands for "landscape" and the "P" for "portrait"). What's more, these new units come with plenty of processing and graphics performance. They feature the new embedded version of the Intel® Core™ i7 processor with its integrated next generation graphics engine. Bottom line, the Touch&Go doesn't need a third-party graphics card for dazzling, eye-catching high definition (HD) graphics, video and 3D. It's built-in.


The integrated graphics can be a big point when it comes to power. A graphics card can easily consume 150 watts. If you're looking to run a computerized display day in and day out, being able to skip the card significantly reduces power consumption. If you're running a number of these interactive tables spread out over a facility or many locations, the savings really add up.


Micro Industries also has another way for you to minimize ownership costs—remote management. By utilizing Intel® vPro technology (specifically its Intel® Active Management Technology, or Intel® AMT, component),[1] these systems can be maintained and managed remotely for reduced total cost of ownership (TCO). They can even be shut down and turned off remotely to increase energy savings.


Interestingly enough, Intel is impressed enough with the Touch&Go messenger series that Intel EVP Sean Mahoney used one in a demonstration at the Fall 2009 Intel® Developer Forum in San Francisco (see video). One of the features he demonstrated was its ability to incorporate a camera and video analytics to determine a viewer's demographics (age, sex) and bring up messages or other information likely to be pertinent to them.


What's your opinion on large table-format touchscreen interactive maps? Are they coming to a museum or mall near you?


[1] Intel® Active Management Technology (Intel® AMT) requires the computer system to have an Intel® AMT-enabled chipset, network hardware and software, as well as connection with a power source and a corporate network connection. Setup requires configuration by the purchaser and may require scripting with the management console or further integration into existing security frameworks to enable certain functionality. It may also require modifications of implementation of new business processes. With regard to notebooks, Intel AMT may not be available or certain capabilities may be limited over a host OS-based VPN or when connecting wirelessly, on battery power, sleeping, hibernating or powered off. For more information, see




In medical imaging, a picture is worth a thousand words and several gigabytes of data. MRI and CT scans, mass-spectrometry, phenotyping, and genetic studies generate hundreds of terabytes of data that must be processed and stored by powerful supercomputers, like the Intel Xeon-based SGI Altix UV being used to support the efforts of the Institute of Cancer Research in England.


Companies that manufacture medical imaging equipment rely on embedded technology vendors to deliver high-performance components that won’t break down prematurely. This type of equipment (such as the GE Healthcare CT750 HD computed tomography scanner pictured below) requires massive computing power to manipulate data and project it as Picture Archive and Communication System (PACS)-level images for radiologists to read on high-definition displays, says Clayton Tucker, senior marketing manager at Emerson Network Power Embedded Computing, a Premier member of Intel® Embedded Alliance.


“The importance of high performance becomes more significant in direct correlation with the discernment of the soft tissue and organs at almost a molecular level,” Tucker says. “Slices in CT have gone from 16 to mind-boggling 256 or more today. That is a tremendous amount of information being deconstructed and reconstructed in real time.”




Longevity is as important as high performance in embedded designs for medical systems, as medical equipment manufacturers want their end products to sell for 5-10 years to recoup their investment and comply with health and safety regulations.


“The average consumer pays little attention to ever-evolving computer technology, except to perhaps complain at the rapid outdating of their home PC,” says Jason Wallace, marketing manager at Corvalent, an Affiliate member of the Alliance. “For manufacturers requiring extensive certifications that necessitate long time to market, the continual evolution of the industry and constant discontinuation of old technology for new can spell disaster for a corporate bottom line.”


To help ensure high performance and long-term operation, Corvalent “overengineers” its motherboard products using a variety of processes, from conformal coating to board-level design changes, Wallace says. For example, the Corsys-M10 Medical Tablet with an Intel Core 2 processor is designed to withstand common hospital chemicals and significant physical abuse for reliable performance in its intended environment.


All of Corvalent’s long life-cycle industrial motherboards utilize Intel technology, allowing the company to supply medical equipment manufacturers with products that last 7-10 years and thus won’t be discontinued multiple times before the end product hits the market, Wallace says.


ADLINK Technology, an Associate member of the Alliance, also uses the Intel architecture in its long-life products for medical applications, such as the CoreModule and NuPRO series of SBCs, says ADLINK channel manager Alan Wells.


“With Intel’s seven year backing of embedded products that allows customers a five-plus year ROI, that, in turn, lowers their total cost of ownership,” Wells says.


Alliance Premier member Kontron likewise leverages the Intel embedded roadmap and partners with strategic suppliers to extend the life of its products for medical environments, such as an ETXexpress baseboard and ETXexpress-CD video frame grabber used in dental imaging systems, said Jack London, product manager with Kontron America’s Embedded Modules Division, during the Q&A session of a recent webcast.


“We are seeing faster turnover because of the rapid pace of change in technology, and for this reason, we go with Computer-On-Module (COM) technology so the computer can scale with the application as processor performance improves,” London remarked.


Emerson also offers COM Express modules as well as MicroATX motherboards such as the MATXM-CORE-411-B, all of which use the latest Intel Core i5 and i7 processors to tackle demanding 3D and 4D ultrasound applications. Manufactured using 32-nanometer process technology, these processors provide medical equipment makers with significant increases in performance and energy efficiency, Tucker says.


Several chips in the Intel Core processor family come with Intel vPro technology, delivering hardware assistance for security and management functions that are particularly important in medical equipment. For example, Intel Virtualization Technology, which enables divergent applications to run in parallel on a single device, minimizes software overhead in compute-intensive medical devices by moving much of the burden of software-based virtualization into hardware, while Intel Trusted Execution Technology guards against software attacks and protects the confidentiality and integrity of vital patient data. These and other technologies combine to provide intelligent performance in reliable, trusted, and cost-effective platforms, Tucker asserts.


“Medical institutions are dealing with an increasingly complex plethora of networked embedded devices in the clinical environment,” he says. “The remote management capabilities provided by Intel Active Management Technology can help them contain rising support costs by querying, fixing, and protecting networked embedded devices even when they’re powered off, not responding, or have software issues.”


With lives depending on these networked medical devices, the need for accurate, reliable, interoperable embedded designs becomes imperative. This series will wrap up next week with a look at how members of the Continua Health Alliance and Intel Embedded Alliance are working together to fulfill these critical requirements.


As far as “big iron” (MRI, CT, and other expensive diagnostic imaging) medical equipment is concerned, what products does your company offer that can meet these applications’ needs for high performance and long life?


Jennifer Hesse

OpenSystems Media®, by special arrangement with Intel® Embedded Alliance

There are a number of weather reporting desktop and mobile software packages available for free or at a low cost. But this software relies on a Windows operating system as the host, and the requirement for Windows leaves a small number of alternatives for realtime systems. Some vendors offer realtime operating systems and hypervisors that permit guest operating systems to run concurrently with other OSes. TenAsys (1) supports its hard realtime OS, the INtime RTOS, alongside Microsoft®'s (2) Windows® OS. Real Time Systems GmbH (3) offers the RTS Hypervisor, which provides partitioning for standard OSes, including an RTOS and a non-realtime OS for user interface applications. RTS supports Microsoft Windows XP as one of the guest operating systems.


Generally, selecting a single vendor for a software platform is a simplifying decision. But in this case we need to be certain of the platform’s ability to support both a Real Time Operating System (RTOS) and the user interface code. For this system I’ve chosen the TenAsys INtime RTOS and Microsoft Windows OS. The TenAsys virtualization platform provides a virtual machine that hosts real-time and embedded operating systems running alongside Microsoft® Windows®. It also provides communication channels to link embedded and Windows applications.


The INtime kernel and realtime processes always have higher priority than any Windows process. This ensures that the realtime portions of the system are provided time to complete. The INtime operating system features a proprietary OS Encapsulation Mechanism (OSEM) that creates and separates the two virtual machines: one for the Windows operating system and one for the INtime RTOS. Once encapsulated, Windows executes as a single, low priority, real-time thread in the context of the real-time root process. However, software doesn’t work without hardware.


Intel recently sent me an ATOM demo system that is really cool. I’ve written about my initial impressions, and as a preview – I’m impressed! The demo kit is not an optimized or purpose-built hardware platform. But it makes a ready-made prototyping tool that allows systems decisions and development to proceed before final hardware is available. With the TenAsys INtime RTOS and Intel’s demo system, we’re ready to start software development. Fortunately, since we’re able to split the realtime environment from the user interface and legacy code, we can use normal Windows development tools for that portion of the development. ATOM software development tool chains have been covered in a previous blog.


Several Windows weather monitoring software packages offer free access to basic weather data. Different systems contain different amounts of historical data. One rich source of data is the US Government. All of these services provide near-term forecasts, some for one week and others for 10 days or more. In addition to providing historic and forecast data, access to web-based interactive tools permits users to explore the facts of weather and forecasts as they desire. This is one of the unexpected strengths of modern embedded systems that incorporate general purpose operating systems primarily to operate complex user interfaces.


Historic weather data is interesting and can be incorporated into the control system for irrigation. But realtime or near-realtime data is more important for controlling turn-on and turn-off schedules for individual water valves. One control algorithm for lawn watering simply turns off the water valve when it starts raining. More sophisticated controls rely on moisture sensors that measure the moisture content of the soil, such as Onset’s sensor. Although at first glance it seems to be a simple task, soil monitoring remains a research topic for computer scientists.


All of the weather data that we’ve discussed so far has relied on an Internet connection. Fortunately the ATOM kit includes an Ethernet port. Providing that there is a local Internet connection, we have access to as much historical data as we want.


It seems like a lot of work and expense to save a little water, but the water savings can be deceptive. I live in the American South West and rely on water hauled by truck to fill my water tanks. Water costs me 3 cents per gallon. My first attempt at irrigation control reduced water usage during the summer from 8,000 gallons per month to 5,000 gallons, or a reduction of about $90 per month. Computerized timing of watering for precise quantities, on schedule, and at more effective times made a huge difference. But the irrigation control was open loop and was made using simple timers. My next refinement was to close the loop. Closing the loop can take several approaches: direct monitoring of the amount of water delivered, measuring water stress in plants, or precision measurements of the amount of water present in the soil.


To give precise measurement of soil moisture, we use Time Domain Reflectometry (TDR). A water content reflectometer can be used to monitor soil water, to calibrate less expensive sensors, and to provide base measurements to evaluate our system. TDR measures volumetric water content based on the dielectric constant of the soil. Two to four probe rods act as wave guides; the dielectric constant of the soil surrounding the rods varies with the amount of water in the soil. The quality of soil moisture measurements is also affected by several other factors, such as soil electrical conductivity, salinity, clay content, and soil compaction, and thus the calibration has to be modified based on local effects.
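The mapping from measured dielectric constant to volumetric water content is commonly modeled with the empirical Topp equation. A minimal sketch in Python, using the widely cited Topp coefficients; as noted above, a real deployment would adjust the calibration for local soil effects:

```python
def topp_vwc(ka):
    """Estimate volumetric water content (m^3/m^3) from the apparent
    dielectric constant Ka measured by a TDR probe, using Topp's
    empirical third-order polynomial (standard published coefficients)."""
    return -5.3e-2 + 2.92e-2 * ka - 5.5e-4 * ka**2 + 4.3e-6 * ka**3

# Dry soil (Ka around 3) yields a few percent water content;
# wet soil (Ka around 25) yields roughly 40 percent.
print(topp_vwc(3.0), topp_vwc(25.0))
```

This single polynomial is what makes an inexpensive permittivity measurement usable as a moisture reading, and it is the curve that site-specific calibration would refine.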

The ATOM prototype demo system can be used both to pulse/measure the probe data and to calibrate the TDR soil probes in real time. This allows less expensive measurements to be taken because we do not require a self-contained piece of electronics. TDR works for soil moisture measurement by accurately determining the permittivity (dielectric constant) of a material from wave propagation; there is a strong relationship between the permittivity of a material and its water content. TDR for soil moisture measurement generally employs phase shift measurement and works in a fashion similar to RADAR: an electrical impulse of 2 ns to 20 ns in width with a sharp rise time is impressed on a wire to the probe, and the reflected waveform is analyzed to determine the phase shift. By using a commercially available TDR soil sensor, we can choose to sample the probe at a 200 kHz rate. The samples need to be averaged in order to achieve reliable results. Within the irrigation control system, controlling the sensors is a realtime operation that is best performed in a hard realtime fashion. Interfacing between the ATOM development board and the sensor is through the development kit’s parallel port; commercial systems may multiplex from 8 to 64 sensors on a single parallel port.


Although the choice of a commercially available moisture sensor reduces the amount of time used to read the sensors, there is still a requirement for high performance during the reading process. The return impulse arrives 16 to 32 microseconds after the impulse impressed on the probe, which allows the ATOM processor to measure the time in a software timing loop. The basic clock rate for an ATOM processor is 1.6 GHz, and a tight software timing loop permits roughly 10 ns resolution for determining the return time of the pulse. During this time, the timing thread must be of the highest priority and be non-interruptible. Resolution of the sensor/timing loop combination may be extended by repeated readings of the sensor: each reading is added to the running total, and the radix point is finally shifted left one place for every power of two readings that were averaged. So if four samples are added, the radix point is shifted two places to the left to arrive at the averaged value.
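The accumulate-and-shift averaging described above can be sketched in a few lines. This is an illustrative Python version of the fixed-point trick; on the target it would be integer arithmetic in the timing code, where an arithmetic right shift of the sum by k bits is the same as moving the radix point left k places:

```python
def average_power_of_two(samples):
    """Average 2**k integer sensor readings with no divide instruction:
    accumulate the samples, then shift the sum right by k bits
    (equivalently, move the radix point left k places)."""
    n = len(samples)
    k = n.bit_length() - 1
    if n != 1 << k:
        raise ValueError("sample count must be a power of two")
    return sum(samples) >> k

# Four 10-bit-style readings averaged by a 2-place shift:
print(average_power_of_two([100, 104, 96, 100]))  # -> 100
```

Restricting the sample count to a power of two is what makes the division free; doubling the count adds one bit of effective resolution at the cost of read time.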


Although not a complex correction, the data from TDR sensors needs to be adjusted by a quadratic equation to achieve a corrected measure of water content. A parametric correction for soil types is a more complex correction, and significant in computational cost.
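A quadratic correction is just a three-coefficient polynomial applied to each averaged reading. A sketch follows; the coefficient values here are hypothetical placeholders, since real values come from calibrating the inexpensive probes against the reference TDR measurement for the local soil:

```python
def correct_vwc(raw, a=-0.2, b=1.1, c=-0.01):
    """Apply a quadratic calibration correction to a raw water-content
    reading. Coefficients a, b, c are illustrative placeholders only;
    they must be fitted per soil type against a reference probe."""
    return a * raw**2 + b * raw + c
```

Because this is a handful of multiply-adds per reading, it can run comfortably in the soft-realtime portion of the system; only a full parametric soil-type model would add meaningful computational load.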

To determine the time remaining until the next watering cycle, we need to develop a rate of change for the water content. Some agricultural experts use a 5 minute sampling interval for reading the water levels. This interval makes sense because in many locations the soil temperature and wind speed can dry the soil fairly quickly. With a number of soil moisture samples available, we can calculate the rate of change and consequently predict the time before the next irrigation cycle.
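The prediction step reduces to a simple extrapolation. A minimal sketch, assuming evenly spaced samples and a linear drying rate (a real controller might fit more history or weight recent samples more heavily):

```python
def minutes_until_irrigation(readings, threshold, interval_min=5.0):
    """Estimate minutes until soil moisture falls to the irrigation
    threshold. readings: recent volumetric water content samples,
    oldest first, taken every interval_min minutes."""
    elapsed = (len(readings) - 1) * interval_min
    rate = (readings[-1] - readings[0]) / elapsed  # VWC change per minute
    if rate >= 0:
        # Soil is not drying (e.g. it is raining); no cycle needed yet.
        return float("inf")
    return (threshold - readings[-1]) / rate

# Drying at 0.004 VWC/min, 0.06 above threshold -> about 15 minutes.
print(minutes_until_irrigation([0.40, 0.38, 0.36], threshold=0.30))
```

Returning infinity when the trend is flat or rising gives the same behavior as the rain-shutoff algorithm mentioned earlier: the valve simply stays off.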


Assume that we have four soil probes; then every 5 minutes the critical sampling periods would consume about 1 millisecond of non-interruptible time. The remainder of the realtime tasks are in the category of soft realtime. For example, the individual valves used to supply water to irrigation lines are latching valves that, once turned on, remain on until turned off. Controlling these valves is a soft realtime task because there is no harm in being, say, a few seconds late in turning on the valves.
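The soft-realtime valve logic is straightforward because a latching valve only needs a pulse on a state change. A sketch, where the pulse function is a hypothetical stand-in for the real parallel-port or GPIO interface:

```python
class LatchingValve:
    """Soft-realtime controller for a latching irrigation valve.
    The valve holds its state after a single actuation pulse, so the
    controller pulses only on transitions; timing jitter of a few
    seconds is harmless. pulse_fn is a placeholder for hardware I/O."""

    def __init__(self, pulse_fn):
        self.pulse = pulse_fn
        self.is_open = False

    def set(self, want_open):
        if want_open != self.is_open:  # actuate only on a state change
            self.pulse(want_open)
            self.is_open = want_open
```

Because redundant commands are filtered out, the control loop can re-issue its desired valve states every cycle without wearing the hardware, which keeps the scheduling requirements firmly in the soft-realtime category.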


Alternatively, microsensors in experimental development at Cornell University act as remote sensors in fields, employing wireless communications to transmit data to a central computer. The sensor is an analog of a plant vascular system that mimics the flow of water in plants through the use of microfluidic tubes. The advantage of using such a sensor, when fully developed, is that irrigation can be initiated when the plant shows signs of water stress. When operational, the sensor will communicate via a wireless connection, such as WiFi. The ATOM development kit supports up to six USB ports, which will allow a transceiver for the microsensors to transmit information to the ATOM kit.


ATOM makes an ideal processor for situations like this because it supports realtime and Windows legacy code in a high performance configuration that can consume small amounts of power. In this article we’ve discussed one application of realtime software combined with user interface code. The application shows wide variations in required processing power: high performance for relatively short durations, followed by ultra-low power consumption between monitoring and control cycles, with occasional needs for intensive user interface activity.


What applications can you think of that need realtime control combined with a sophisticated user interface?




  1. TenAsys is an Affiliate Member of the Intel Embedded Alliance
  2. Microsoft is an Associate Member of the Intel Embedded Alliance
  3. Real Time Systems GmbH is an Affiliate Member of the Intel Embedded Alliance

Henry Davis
Roving Reporter (Intel Contractor)
Intel(r) Embedded Alliance

While health care insurance and public policy spark heated debate, it’s difficult to dispute the value of medical devices and equipment that can minimize operating costs and improve patient care. The burden on medical device developers is to provide advanced technologies that fulfill the health care industry’s need for fast, secure, convenient solutions certified to strict rules and regulations.


In a market exploding with growth driven by trends such as an aging population, shortage of health care professionals, skyrocketing costs, and federal incentives to digitize health records, several Intel® Embedded Alliance companies are taking advantage of the momentum and developing innovative products optimized for medical applications.


This blog series will highlight a number of technical challenges specific to medical device design and explore how Alliance member companies are resolving these problems using embedded Intel technology. Beginning with this post, the series will break down into three segments exploring interrelated requirements:


  1. Portability, small size, low power consumption
  2. Fast, powerful performance and long life support
  3. Accuracy, reliability, interoperability

To start things off, the considerations of portability, small size, and low power consumption are becoming more critical as the medical community increases its mobility. In a recent report, Gartner identified mobile health monitoring as one of the top 10 consumer mobile technologies for 2012, citing its potential for reducing costs and improving quality of life for patients.


Mobile health care calls for portable medical systems that must be small, lightweight, and able to be unplugged and powered by batteries, which last longer if the CPU uses less power, says Jennifer Zickel, COM Express product line manager at RadiSys, a Premier member of the Alliance.


“Active fans for thermal dissipation cannot be used in portable or cart-based equipment, which must be passive for reliability, so this drives the processor requirement for low power,” she says.


Further complicating matters, patient monitoring instrumentation is rapidly changing from a basic portable unit to a networked, multiparameter product with advanced graphics and touch-screen capabilities, Zickel adds. This consequently requires the integration of wireless connectivity, which must be reliable and secure to comply with HIPAA and other regulations, as pointed out by Joseph Chung, medical product manager at Advantech, another Premier member of the Alliance, during a recent webcast. Designers must include support for Wi-Fi and Bluetooth technologies to give health care professionals access to vital information at the point of care.


“Nurses want to have a lightweight, compact product that they can carry around easily and use beside a patient’s bed,” says Lydia Cheng, also a medical product manager for Advantech.


Cheng explains that compact design and low power consumption were the two major requirements for Advantech engineers developing the MICA-101, a medical tablet PC based on the Intel Mobile Clinical Assistant (MCA) reference architecture. Using the Intel Atom Z510/Z530 processor helped the design team meet these goals by providing 1.1/1.6 GHz performance with 2 W max Thermal Design Power (TDP). Products based on the Intel MCA reference design, such as the Motion Computing C5 pictured here in use at UCSF Medical Center in San Francisco, give health care providers a consolidated view of patient data in a convenient platform optimized for clinical environments.



Alliance Affiliate member Arbor Technology also bases most of its medical products, including the M1255 medical tablet PC and M1726 bedside infotainment computer, on the Intel Atom microarchitecture to satisfy the considerations of fanless and mobile computing, says Arbor market development manager Kevin Huang.


“For medical application and hospital environment requirements of low noise, anti-splash, disinfection, ruggedness, and mobility, the Intel Atom processor can help us have a better mechanical design and the best performance/watt to meet these requirements,” Huang says.


Besides portable nursing carts and bedside computers, medical applications often involve information systems that call for higher-performance platforms powered by dual-core processors like the Intel Core 2 Duo, says Joey Hsu, product manager for the eService Platform Division of Avalue Technology, an Affiliate member of the Alliance. But for small, low-power applications, solutions such as Avalue’s MTP-1503 bedside infotainment station and MTP-1203 modality gateway to hospital information systems, both with the Intel Atom N270 processor, provide the low power consumption needed in thin client terminals, Hsu says.


“Low-power systems or devices will prolong power endurance to avoid the trouble of back-and-forth recharging as devices are used on the go,” he remarks.


Small form factor boards using Intel processors are particularly suited to mobile medical applications, as they offer low power consumption, integrated graphics, Ethernet, and support for a variety of operating systems, Zickel says.


“COM Express modules enable medical equipment I/O to be customized while giving the equipment a wider processor and power selection to meet market demand,” she says. “Motherboards provide a fast time-to-market but more generalized approach, and SBCs provide the smallest and most cost-sensitive solution to medical equipment needs.”


RadiSys’ Atom-based COM Express and Pico-ITX products such as the Procelerant Z500 and PCIOZ500 have been successfully deployed in patient monitoring and portable ultrasound equipment, Zickel says. While size and mobility are crucial considerations in these systems, other medical applications involving complex image processing and other high-level functions demand more attention to performance and endurance. Next week’s post will delve into requirements for fast processing and long life support and discuss how Intel Architecture technologies are enabling Alliance members to resolve these design issues.


With all the buzz about health care being generated today, what kinds of opportunities are you seeing for embedded products in the medical device market? How is your company meeting the needs of small size, portability, and low power for medical devices?


Jennifer Hesse
OpenSystems Media®, by special arrangement with Intel® Embedded Alliance
