Picking and sorting objects is an activity few humans look forward to with great enthusiasm. The tedious nature of the task, however, makes it a prime candidate for automation using robotics. Beyond the obvious hardware a robot needs to operate within the confines of a designated task, a less obvious component, machine vision, is critical to efficient robotic sorting. The technologies involved in machine vision, sensing, and object interaction are already being used by robots with great success on the International Space Station, where they complete even complex tasks semi-autonomously.

Robotics labs around the world are hard at work refining the technology for applications in factories, warehouses, and even relief efforts in disaster areas. The environment in each case is likely cluttered and rife with objects of varying size, weight, and orientation. This is a perfect setting to test and apply the advancements of machine vision and object interaction.

[Image: machine vision in action on an assembly line]

Machine Vision Is Increasingly Relied Upon In Automation To Drive Quality And ROI. The Market Is Expected To Reach $18.7 Billion By 2022.

Components of Machine Vision

Machine vision has significant capabilities on factory floors and production lines. As systems acquire product images and extract the relevant information, the information is analyzed and communicated to the outside world. A lot goes into the technology behind machine vision, which can be broken down into five essential components: lighting, lenses, vision processing, image sensing, and communications.

Machine Vision Lighting

Lighting is essential to the success of machine vision. By analyzing reflected light in captured images, machine vision systems can effectively identify objects as well as their orientation in an environment. Several lighting techniques can be utilized by machine vision systems, including backlighting for external and edge measurements, structured lighting patterns to interpret angles on an object’s surface, and strobe lighting to freeze moving objects for examination and counter motion blur. These are only a few examples of the lighting techniques utilized in machine vision systems, which can also incorporate diffuse dome, bright-field, dark-field, and axial diffuse lighting. A more comprehensive guide to machine vision lighting can be found in this whitepaper from National Instruments.

Machine Vision Lenses

Just as in conventional cameras, lenses capture an image and deliver it to sensors within the camera. One can also think of this in terms of our eyes delivering the images we see to our brains for interpretation. Fixed and interchangeable lenses are the most common types of lenses in machine vision systems. Lenses of varying sizes and shapes are used to capture the most precise image for the system’s intended use. Fixed lenses are typically standalone components that autofocus either through mechanical adjustment or via a fluid lens that adjusts automatically to deliver the sharpest image. These lenses have a fixed field of view at a given distance. Interchangeable lenses, on the other hand, are typically equipped with C-mounts or CS-mounts that allow them to be attached to or removed from the systems they are enhancing at will. Vision Systems Design does an excellent job detailing the fundamentals of machine vision lenses in this article.

Machine Vision Image Sensors

An essential component of image capture, image sensors utilize a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) to interpret light as electrical signals. In plainer language, image sensors capture the light reflected from an object and interpret it as a digital image with precise details that processing software uses for accurate measurements. A more comprehensive article on image sensors from Coventor can be found here.

Vision Processing Units

A Vision Processing Unit (VPU) is another component of machine vision that serves to extract information from the digital images captured by the cameras being used. The processing undertaken by these microprocessors can be completed externally or internally on a standalone system. The process is completed over several steps: images are first acquired from the sensor, then software identifies specific features of the image, takes measurements, and compares them against specifications to reach a decision. The result is then communicated to the system to trigger additional actions. While it is true that the physical components of machine vision are integral to the overall function of these systems, the processing algorithms that evaluate and compare results are the most influential. Processing software is responsible for configuring camera parameters, pass-fail detection, communicating information to factory floors, and supporting Human Machine Interface (HMI) development.
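As a concrete illustration, the acquire–extract–decide–communicate sequence can be sketched in a few lines of Python. Everything here is hypothetical: the "image" is a plain list of pixel rows and the feature extraction is a simple pixel count. A production system would use a vendor SDK or a library such as OpenCV.

```python
# A minimal sketch of a VPU-style pass/fail decision loop.
# All names and values are illustrative, not from a real system.

SPEC_WIDTH_PX = 120      # expected part width, in pixels
TOLERANCE_PX = 4         # allowed deviation before a part fails

def measure_width_px(image):
    """Extract a feature: count bright pixels in the middle row."""
    mid_row = image[len(image) // 2]
    return sum(1 for px in mid_row if px > 128)

def inspect(image):
    """Compare the measurement against spec and reach a decision."""
    width = measure_width_px(image)
    passed = abs(width - SPEC_WIDTH_PX) <= TOLERANCE_PX
    return {"width_px": width, "pass": passed}

# A fake 3x200 "image" whose middle row contains 121 bright pixels.
frame = [[0] * 200, [255] * 121 + [0] * 79, [0] * 200]
result = inspect(frame)
print(result)  # width 121 is within tolerance, so the part passes
```

In a real pipeline, the decision dictionary would be handed off to the communications layer to trigger downstream actions.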

Machine Vision System Communications

As one might conclude from this brief overview of machine vision, these systems are an amalgamation of parts and components that must all work in unison to deliver accurate results consistently and in real time. Add to this the fact that environments can change dynamically and at a moment’s notice, and the system must be able to realign its vision on the fly. Communications are an essential component of machine vision systems and are typically executed through discrete I/O signals or via data delivered over a serial connection to a connected device, i.e. Ethernet or RS-232 output. Programmable logic controllers (PLCs) are the most common connection to discrete I/O points and can control stack lights, solenoids, or other indicators to trigger reject responses from the system.
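To make the two output paths concrete, here is a hedged Python sketch: one helper sets a reject bit in a discrete output word, the other frames a result string for a serial or Ethernet listener. The bit layout and message format are invented for illustration; a real installation would follow the PLC vendor’s I/O map and protocol.

```python
# Illustrative only: a made-up discrete I/O bit layout and ASCII
# record format for signaling an inspection result downstream.

REJECT_BIT = 0b0000_0100   # hypothetical output bit wired to a reject solenoid

def discrete_output(current_state: int, reject: bool) -> int:
    """Set or clear the reject bit in an 8-bit discrete output word."""
    if reject:
        return current_state | REJECT_BIT
    return current_state & ~REJECT_BIT

def serial_message(part_id: str, passed: bool) -> bytes:
    """Frame a result as an ASCII record for an RS-232/Ethernet listener."""
    status = "PASS" if passed else "FAIL"
    return f"{part_id},{status}\r\n".encode("ascii")

print(discrete_output(0b0000_0000, reject=True))   # 4: reject bit raised
print(serial_message("A1042", passed=False))       # b'A1042,FAIL\r\n'
```

Keeping the reject signal to a single bit is what lets a PLC react within one scan cycle, without parsing any message at all.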

Machine Vision Applications In Production Processes

As mentioned above, machine vision is already hard at work on the ISS. However, more down-to-earth applications exist right here on Earth. Industrial inspection is among the largest application areas for machine vision. However, many other applications, such as medical imaging, remote sensing, and autonomous vehicles, use machine vision on a continuous basis.

Sorting

Machine vision applications in warehouse and factory conditions are exceptional at mitigating the human error that can affect repetitive processes, such as bin-picking and sorting tasks. The technology allows robots to make sense of a cluttered workspace or full bin and extract the relevant objects appropriately.

Assembly Verification

When quality is of the utmost importance, consistent outputs on the assembly line are essential to protecting a company’s bottom line. Inspection operations in assembly verification are completed in milliseconds to ensure that every item is up to spec and that incomplete products don’t make it past the check.

Automation

Automating production is an essential function of manufacturing operations. Machine vision can assist in detecting system abnormalities, jams, and other hiccups that can affect the production process. Improving the consistency of operations ensures minimal interruption and a reduction in the production costs that result from downtime.

Removing Defects

Automation demands two things: consistency and simplicity. When things get complicated, they tend to also get expensive. Machine vision systems are capable of inspecting hundreds of items per minute to a high degree of accuracy, ensuring that defective items are removed from the equation before they can affect a business’ bottom line, without the need for complex systems of checks throughout the production process.

Identification

The ability to scan barcodes and other identifying marks under several difficult conditions, be it lighting, texture, or packaging, is essential to keeping operations running smoothly. Machine vision systems help achieve optimum efficiency in quickly reading necessary labeling information on the production line and in distribution centers.
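One small, well-defined piece of that identification step can be shown directly: validating the check digit of an EAN-13 barcode after the symbol has been decoded. The decoding itself, locating and reading the bars under difficult lighting, is beyond this sketch.

```python
# EAN-13 check digit rule: weight the 13 digits 1,3,1,3,... from the
# left; the total, including the check digit, must be a multiple of 10.

def ean13_is_valid(code: str) -> bool:
    """Return True if the 13-digit code has a consistent check digit."""
    if len(code) != 13 or not code.isdigit():
        return False
    total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(code))
    return total % 10 == 0

print(ean13_is_valid("4006381333931"))  # True  (valid check digit)
print(ean13_is_valid("4006381333930"))  # False (corrupted last digit)
```

This is why a single misread digit is almost always caught at the scanner rather than propagating into inventory data.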

About Encompass Solutions

Encompass Solutions is a business and software consulting firm that specializes in ERP systems, EDI, and Managed Services support for Manufacturers and Distributors. Serving small and medium-sized businesses since 2001, Encompass modernizes operations and automates processes for hundreds of customers across the globe. Whether undertaking full-scale implementations or the integration and renovation of existing systems, Encompass provides a specialized approach to every client’s needs. By identifying customer requirements and addressing them with the right solutions, we ensure our clients are equipped to match the pace of industry.


Space Robots

With the successful launch of SpaceX’s Falcon Heavy rocket, now is an excellent opportunity to talk about space robots, machine vision, and their roles in expanding space research and exploration.

 

[Image: the Starman mannequin and Tesla Roadster with Earth in the background, from the SpaceX.com live feed]

A Stunning View Of Earth Captured Following The Falcon Heavy Launch. Photograph: SpaceX.com Live Feed.

 

Space robots. Modeled after humans in terms of morphology and size, they are superior to industrial robots when it comes to versatility and capability. While they may not yet look as advanced or operate as nimbly as their representations in sci-fi features from the Star Wars and Star Trek franchises, that gap is quickly shrinking. By taking on repairs and other tasks deemed too dangerous for astronauts, these specialized robots are the obvious candidates for many of the precarious activities taking place beyond the relative comfort of Earth.

Space Robots: R2 Goes To The International Space Station

The first humanoid robot in space, Robonaut 2, R2 for short, was developed by the Dexterous Robotics Laboratory at Johnson Space Center (JSC) in Houston, Texas. R2 emerged earlier this decade as the latest subject of robotics research in space. Originally consisting of only an upper torso and arms, R2 has since been equipped with two climbing manipulators (read: legs) capable of providing mobility in zero-g environments to complement the dexterous arms and digits that handle intricate tasks. R2’s evolution is a marvel for researchers and enthusiasts to behold, but what’s more impressive than the improvements over its predecessor, R1, are the advanced sensing capabilities that allow R2 to truly perform in one of the most challenging environments imaginable.

[Image: Robonaut 2 aboard the International Space Station]

Robonaut 2 Working Tirelessly Aboard The International Space Station.

Machine Vision, Sensing, And Perception

The abilities to touch and see are perhaps the most extraordinary of these robots’ capabilities. Vision and sensing components relay complex sets of data such as the identity, position, and orientation of objects in an image. Powerful industrial machine vision and process guidance systems are giving next-generation robots the ability to evaluate and react effectively in real time.

Without machine vision, robots are little more than extensions of their controllers and the setpoints governing automated tasks. In R2’s case, 3D vision is the component that allows it to perform complex tasks in a semi-autonomous fashion. R2 is capable of both remote control by operators and semi-autonomous operation using advanced software that lets R2 “think” through the solution to a given task. Software updates regularly expand the depth and breadth of R2’s capability. R2’s vision is governed by five cameras in all: two provide stereo vision for the robot and its operators, and two auxiliary cameras serve as backups. Stereo vision allows images from two vantage points to be compared, effectively allowing R2 – and us – to see in 3D. A fifth, infrared camera is contained within the mouth area to aid in depth perception. All vision components are housed within the cranium, while R2’s “brain” is located within the robot’s torso. R2 can look up and down, left and right, to fully gauge its surroundings.
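The stereo principle, comparing images from two vantage points, reduces to a one-line triangulation formula: depth equals focal length times baseline divided by disparity. The Python sketch below uses made-up camera parameters for illustration; they are not R2’s actual specifications.

```python
# Stereo triangulation for a single matched feature pair.
# FOCAL_LENGTH_PX and BASELINE_M are illustrative values only.

FOCAL_LENGTH_PX = 800.0   # focal length expressed in pixels
BASELINE_M = 0.06         # distance between the two cameras, in meters

def depth_from_disparity(x_left_px: float, x_right_px: float) -> float:
    """Triangulate depth (meters) from one matched feature pair."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("feature must shift left-to-right between views")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity

# A feature at x=412 in the left image and x=380 in the right image
# has a disparity of 32 pixels, which triangulates to about 1.5 m.
print(depth_from_disparity(412, 380))
```

Note the inverse relationship: nearby objects produce large disparities and are measured precisely, while distant ones produce tiny disparities and noisy depth, one reason stereo heads like R2’s work best at arm’s length.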

 

[Image: R2 with legs attached]

R2 Equipped With Two Climbing Manipulators, Read As Legs, Capable Of Providing Mobility In Zero-G Environments.

 

A prime example of cooperative robotics at work, R2 interacts with the astronauts on the ISS much the way another person might. Operating at a pace relative to humans, R2 has a soft, padded skin equipped with sensing systems that allow it to react when encountering a person. Force control is provided by torsion springs inside the robot, allowing R2 to react to influence from the environment. So, when a person pushes away an arm, R2 gives to the motion and lets the person by. This sensing capability also provides R2 with continuous awareness of its orientation and the location of its limbs relative to the environment and surrounding people.

Object Interaction

As for Robonaut’s interaction with its environment, its hands work a bit differently than both humans’ and industrial robots’. The key difference resides in R2’s tendon-driven robotic fingers. Typically, robots control such joints via tension controllers located on each tendon individually; put another way, joint torque translates into tendon tension. This poses a problem in R2’s case. The resulting disturbances between joint displacement and tendon tension had to be addressed for R2 to be able to interact with unfamiliar objects in an array of positions, in stark contrast to R2’s industrial cousins, which operate in uniform spaces with familiar objects. The solution to R2’s dilemma came when NASA and GM engineers devised a joint-based torque control method that decouples the tendons. All this talk about torque is of particular importance for R2, as well as many other humanoid robots, due to the need for an adaptable grip when interacting with a variety of objects large and small.

[Image: Robonaut 2 holding different tools]

Robonaut 2 Is Capable Of Working With An Array Of Tools. Photographer: Robert Markowitz.

What’s Next For The ISS And Non-Human Crewmembers

The most recent iteration of Robonaut to come from Houston’s JSC is R5, or Valkyrie. Built to compete in the 2013 DARPA Robotics Challenge (DRC) Trials, Valkyrie was designed over a 15-month period and improved on the electronics, actuators, and sensing capabilities of earlier generations of JSC humanoid robots. In particular, R5’s vision and sensing improvements are a tremendous advancement over those found in R2. Valkyrie’s redesigned head sits on a neck with three degrees of freedom and features a Carnegie Robotics MultiSense SL, a tri-modal (laser, 3D stereo, and video), high-resolution, high-data-rate, and high-accuracy 3D range sensor, as its main perceptual sensor. Additional modifications include infrared structured-light point cloud generation beyond the laser and passive stereo methods already implemented, as well as front and rear “hazard cameras” positioned in the torso.

[Image: Robonaut 5 with hands on hips]

The Latest Iteration Of Robonaut, Robonaut 5, Is Also Referred To As Valkyrie And Features The Latest Tech In Robotics For Space Applications.

As research advances technology here on the ground, components and software can be sent to the ISS for utilization. Once proven to operate effectively on the ISS, NASA and other robotics laboratories hope that innovative robotics and associated technologies can be applied further in the depths of space. In the future, thermal resistance for robots will likely be a main focal point for researchers.
