Collaborative robots (cobots) are gaining traction in modern manufacturing facilities. Prices have steadily fallen to within reach of most small and medium-sized manufacturers. By offloading the repetitive portions of a human counterpart's workload, a cobot increases overall output, accuracy, and safety.

With COVID-19 forcing businesses to reevaluate collaboration in the workplace, cobots are entering conversations at an increasing rate. Where two humans may not be able to share a common space, a human and a cobot can do so safely without sacrificing productivity.

Cobots Are Built For – Wait For It – Collaboration

When we examine robotics and automation in the age of Industry 4.0, we can identify many familiar, though more archaic, forms of robotics in the manufacturing facility. Unfortunately, many of these examples pose quite a hazard to their human coworkers. Robot cell design has traditionally put human safety at the forefront and locked these machines in cages or behind walls.

Cobots break free of those bonds and work right alongside people. By design they are considerate of our fragility. Internal and external sensors ensure they speed up and slow down operations based on proximity to humans.
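The proximity behavior described above can be sketched as a simple speed-scaling rule. This is an illustrative toy, not any vendor's control loop: the distance thresholds are invented, and real limits come from a risk assessment under the collaborative-robot safety specification ISO/TS 15066.

```python
def scaled_speed(distance_m, max_speed=1.0, stop_dist=0.3, full_speed_dist=1.5):
    """Scale cobot speed with distance to the nearest detected human.

    Below stop_dist the robot halts; beyond full_speed_dist it runs at
    full speed; in between, speed ramps linearly. All thresholds here
    are made-up illustration values, not certified safety limits.
    """
    if distance_m <= stop_dist:
        return 0.0
    if distance_m >= full_speed_dist:
        return max_speed
    return max_speed * (distance_m - stop_dist) / (full_speed_dist - stop_dist)
```

Under these example thresholds, a worker standing 0.9 meters away would put the cobot at half speed; step inside 0.3 meters and it stops entirely.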

This is just one example of how collaborative robots are making their adoption more palatable in the factory.

With a starting price tag on most models circling $25,000, cobots are more affordable than they have ever been.

As adoption increases and innovation in human-cobot collaboration proliferates, we can expect to see even more exciting technologies and use cases emerge as Industry 5.0 approaches.

Collaborative Robots Make Smart Manufacturing Accessible

Not only can we program cobots to perform any series of dull, dangerous, and dirty tasks ad nauseam, we can also retask them with new methods of performing work easily.

This is in stark contrast to hard or fixed automation examples in industries like aerospace and automotive manufacturing.

Those robots, generally referred to as articulated robots, are built with one purpose in mind and are rigid in their programming. Changing tooling and behavior takes significant effort, too often resulting in lost production time. These robots even risk becoming obsolete as production changes over time.

On the other hand, collaborative robots can be reprogrammed and equipped with novel end of arm tooling to complete an array of complex tasks. Whether the job requires drilling, stamping, welding, gripping, or just about any other method of manipulation, chances are there is a cobot solution out there already or currently being developed.

The experience of programming cobots is becoming more intuitive every day. Examples like drag&bot and mBlock come to mind, with simple drag-and-drop coding snippets that can program a robot to perform manufacturing tasks. As the coding environment becomes more accessible to workers who traditionally had hands-on interaction with jobs, the result is new ways of achieving efficiency. This is in stark contrast to a computer science-oriented way of thinking about manufacturing scenarios.

Not only this, but career machinists can put their valuable knowledge to work in new ways that spark creativity, growth, and fulfillment in their career.

Where Cobots Fit Into The Manufacturing Facility

Collaborative robots in manufacturing can take the mundane and stressful in stride. Excelling at dull, dirty, and dangerous tasks is what they do best.

Consider cobots scaled-down versions of other familiar robots already utilized in manufacturing facilities around the world.

Bench-top models resemble miniature articulated robots at 3 to 4 feet in length. They are agile, lightweight, and customizable to an array of tasks. These same robots can also be mounted to mobile platforms, expanding the cobot's range of operation so it can take on many different assembly tasks with quick changeover from station to station.

Mobile cobot examples are also gaining momentum ferrying materials and components to staging areas, so that staff don’t waste time walking from their bench to picking locations in your facility. Manufacturing and distribution centers gain incredible productivity when these useful robots can carry weights no human possibly could between storage and staging areas. The result is more employee engagement regarding the tasks at hand, fewer opportunities for accidents, and less time wasted walking the floor.

Mobile robots also make managing inventory much easier on your operations. They can complete non-stop audits of inventory, suggest amended sorting based on flow of goods, and other efficiency opportunities based on the data flowing from their sensors into your ERP system. Businesses can then bring planning and scheduling to greater levels of efficiency, driving supply chain management like never before.

Beyond the physical examples, there are digital examples of cobots to consider, too.

Chat bots answer visitor inquiries on your website and report back to your CRM with visitor profile data.

Robotic process automation (RPA) routines fall into this category, too. They handle repetitive data-entry and admin tasks at an exponentially faster rate than humans. Not only does this example enable immense cost savings for large operations, but smaller businesses can remain competitive by handling more tasks with fewer costs sunk into non-revenue generating activities.
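As a flavor of what such a routine does, here is a minimal sketch of RPA-style data entry in Python: the same field-by-field copying a clerk would perform, expressed once as a repeatable function. The column names and target record shape are hypothetical, not any particular ERP's schema.

```python
def transfer_invoice(source_row):
    """Map one raw spreadsheet row into a normalized ERP-style record.

    Field names here are invented for illustration.
    """
    return {
        "invoice_id": source_row["Invoice #"].strip(),
        # store money as integer cents to avoid floating-point drift
        "amount_cents": round(float(source_row["Amount"].replace("$", "").replace(",", "")) * 100),
        "vendor": source_row["Vendor Name"].strip().title(),
    }

rows = [{"Invoice #": " INV-1001 ", "Amount": "$1,250.50", "Vendor Name": "acme supply"}]
records = [transfer_invoice(r) for r in rows]
```

Run against thousands of rows, a routine like this finishes in seconds and never transposes a digit, which is where the cost savings mentioned above come from.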

Both examples automate repetitive tasks that free your staff to take on more creative and value-added work.

All these collaborative robots fit into your facility in some way or another and feed the information they gather into enterprise systems.

These data transactions inform every level of your business, enabling you to make more intelligent decisions on the fly.

The Elephant In The Room – Fear Of Replacement

Ever since robots made their debut on production floors, humans have feared being replaced. Robots don’t complain, they don’t get tired, and they don’t get paid.

Despite these perks for companies, perhaps the biggest drawback when it comes to robots is that they don’t think for themselves.

They do not innovate; they do not come up with more efficient ways of doing things, and they certainly are not creative. In this sense, they are about as dumb as the sum of their parts without a human giving the orders in the form of a program.

Before COVID-19, talent was hard to come by, especially for skilled trades in manufacturing. Cobots were one way to ease the burden these companies faced in a sellers' market for talent. According to one 2018 Deloitte study, as many as 2.4 million U.S. manufacturing jobs were expected to remain unfilled through 2028. This translated into an economic impact of $2.5 trillion.

The argument can be made that many skilled workers complete repetitive tasks daily, and these tasks are the target of cobot solutions. They are the tasks that wear on the hungry mind or eat at the fragile structures of the human body; the result is nagging injury and unfulfilled potential.

When cobots shoulder the burden of these tasks, their human counterparts can develop personally and professionally.

Take the Paradigm Electronics cobot example into consideration. The introduction of collaborative robots into the workplace didn't lead to job loss. In fact, quite the opposite: staff who were machinists up-skilled to robot programmers, and their traditionally static jobs fell by the wayside to be scooped up by cobots. The resulting dynamic was not a job loss for humans, but a redefinition of their professional roles and responsibilities.

Collaborative Robot Proliferation

The rise of cobots has been shown to bring more benefits than drawbacks.

Industries continue to adopt the technology and human workers see the benefits in their adoption.

Even startups can reap the benefits of a workforce that doesn't slow down and can always remain nimble amid the changing landscape of business.

Whether it be on the manufacturing floor, the vineyard, or the back of the house at a restaurant, cobots are fast becoming a part of our daily lives.

Every day the price of cobots is driven down by fierce competition among robot manufacturers.

Innovations in sensing technologies and machine vision further increase efficiencies.

Currently, the cobot market is expected to exceed $11 billion by 2030. According to ABI research, the greater cobot ecosystem, which includes software, tooling, and customization, is expected to grow to $24 billion by the same time.

The above picture reveals a bright future for manufacturers and workers as this exciting technology develops.

About Encompass Solutions

Encompass Solutions is a business and software consulting firm that specializes in ERP systems, EDI, and Managed Services support for Manufacturers and Distributors. Serving small and medium-sized businesses since 2001, Encompass modernizes operations and automates processes for hundreds of customers across the globe. Whether undertaking full-scale implementation, integration, and renovation of existing systems, Encompass provides a specialized approach to every client’s needs. By identifying customer requirements and addressing them with the right solutions, we ensure our clients are equipped to match the pace of Industry.


‘Daisy’, the disassembling robot, is Apple’s answer to reclaiming the valuable materials that go into the creation of nearly every Apple mobile and tablet device. Aluminum, cobalt, gold, silver, platinum, and many other valuable metals and rare earths can be extracted using this useful technology. Daisy’s efficiency is punctuated by an exceptional rate of recovery, disassembling and collecting materials from outdated devices at the rate of 200 iPhones per hour.

Not only has the recycling robot enabled Apple to recover these valuable resources, but the machine’s creation process has yielded valuable information on recycling many of these materials in a cleaner and more efficient manner. New processes have eliminated the need to introduce contaminants and other dangerous substances into the recycling process, leading to an unsullied recycling process all around.

Daisy is not the first iteration of a mechanical recycler, though. Liam was the predecessor announced by Apple back in 2016. A very specialized robot, Liam was designed specifically to disassemble iPhones to access the recyclable materials inside. Some of the crucial components Liam sought out, and Daisy now seeks, in each iPhone carcass include cobalt and lithium from the phone's battery, gold and copper from the camera, silver and platinum from the device's logic board, and aluminum from the enclosure.

Additional Charity With Apple’s GiveBack

Alongside the Daisy announcement, which coincided with Earth Day, Apple announced a commitment to match customers' device turn-ins with charitable contributions to Conservation International, an environmental non-profit based in Virginia, until April 30th. Some devices being turned in will even nab in-store gift cards and credit for those donating.

The press release detailing Daisy and Earth Day campaigns from Apple, along with all media associated with the announcement, can be found in the Apple Newsroom.

Cleaner Streams Of Recycling With Material Recovery Facilities

Daisy’s announcement is just one of the many emerging advancements taking place in the world of Material Recovery Facilities (MRFs). The recent WasteExpo 2018 in Las Vegas highlighted many of the recent advancements in the world of cleaner recycling and material recovery in electronics recycling, cleaning and sorting equipment, and municipal recycling endeavors.

Robotics and Artificial intelligence, in particular, are assuming significantly larger roles in the advancement of recycling efforts, enabling greater efficiencies in:

  • Heavy lifting
  • No deviation due to fatigue
  • Repetitive tasks
  • Continuously high levels of concentration
  • Purity rates and consistent and accurate identification of products
  • Pre-emptively tracking and managing work
  • Maximal operating time
  • Evolving identification of products and more meaningful data
  • Reproducibility of results
  • Reduced labor and training
  • Lower operating costs

While these innovations in the field of material recovery have enabled companies like CleanRobotics and AMP Robotics to function with greater efficiency, difficulties remain within the variety of materials flowing into the recycling stream. The resounding answer to the challenge emerges again and again with machine vision.

Material Recovery Pushes Advances In Machine Vision Systems

Coupled with robots on recycling conveyor systems, machine vision systems identify elements and materials according to a number of characteristics. Once identified, the robotic component will employ suction, grippers, and grabbers to remove materials from the conveyor and sort them accordingly, for either direct recycling or further disassembly, if necessary. Eagle Vision and Bulk Handling Systems are two entities addressing the need for more robust machine vision systems in MRFs.
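The identify-then-sort loop can be reduced to a small routing decision. The material labels, bin names, and confidence threshold below are invented for illustration; they stand in for whatever a given MRF's vision model actually emits.

```python
def route_item(label, confidence, threshold=0.8):
    """Return the destination bin for a detected item, or None to let it
    pass for manual sorting. Labels and bins here are hypothetical.
    """
    bins = {"PET": "plastics", "aluminum": "metals", "cardboard": "fiber"}
    if confidence < threshold or label not in bins:
        return None  # uncertain or unknown: leave it on the belt
    return bins[label]
```

A confident PET detection is diverted to the plastics bin, while a low-confidence or unknown item passes through untouched rather than contaminating a clean stream.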


Interconnectivity between MRF system components, read as the Industrial Internet of Things (IIoT), allows devices to “speak” with one another across the facility. The results can be presented as simply as “I’m getting too much plastic,” to which screens can be adjusted to narrow or expand the flow of specific materials, according to Nathanaël Lortie, co-founder and president of Eagle Vision.

With accuracy rates reaching upwards of 85-95 percent, these robots and their associated systems far surpass the public's shoddy-by-comparison 30 percent accuracy.



Over the last half-century, robots have been relied upon as an integral part of manufacturing. Their presence offers incredible benefits, including enhanced production speed, accuracy, and tireless labor. However, they can't do it all. As a result, a newer class of robot has become increasingly prevalent in the manufacturing environment year after year. Engineered to work collaboratively alongside their human counterparts, these smaller, more agile implements on the manufacturing floor are referred to as collaborative robots.

Breaking Down Collaborative Robotics

Rather than replacing a worker in completing a specific task, collaborative robots, more colloquially referred to as cobots, are designed to enhance a production team’s capabilities on the shop floor. With modern, plug-and-play functionality, cobots shed their larger predecessors’ bulky protective cages and lend precision, power, and agility to their teams.


As innovations in robotics make these tools more user-friendly and affordable, smaller manufacturing operations will benefit from their integration.

Coming in at around 3 feet in height, these lightweight additions to the manufacturing process are quickly becoming fast friends with their increasingly specialized human coworkers. This is due in large part to the shedding of misguided anxieties that robots will replace workers outright, which is quite unlikely to take place within our lifetime. In fact, it is increasingly recognized that collaborative robots elevate human workers from positions of a mundane and repetitive nature to more specialized roles, valued more highly by companies and individual workers alike.

Primary Cobot Operations

While cobots can certainly be designed for specific purposes and roles according to customer specifications, by and large, they are intended for specific types of operation within the manufacturing environment.

The most common utilization of cobots emerges in power- and force-limiting roles, whereby robots are in close proximity to people and must remain continually aware of the power and force that humans can withstand. This means the energy transferred through sudden impacts and collisions with people is kept below maximum thresholds set by the International Organization for Standardization (ISO).


Sensing and monitoring technologies come into play to enable safety monitored stop operation in cobots, where nearness to humans is the primary focus. In this case, the response to a human passing the proximity threshold is for the cobot to terminate motion. The same technology is used in speed and separation monitoring, whereby the speed and proximity of a human are tracked in relation to a cobot’s position. Preventative measures are taken, namely a reduction in the speed of operation, as the worker draws closer to the cobot in operation.
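Taken together, the two monitored behaviors amount to a zone check on the tracked human's distance. The zone radii below are illustrative only; in practice they are set by a risk assessment for the specific cell.

```python
def safety_response(distance_m, stop_zone=0.5, slow_zone=1.5):
    """Classify the cobot's response to a human's tracked proximity.

    Inside stop_zone: safety-monitored stop (motion terminates).
    Inside slow_zone: speed and separation monitoring (reduced speed).
    Otherwise: normal operation. Zone radii are made-up examples.
    """
    if distance_m < stop_zone:
        return "stop"
    if distance_m < slow_zone:
        return "reduced_speed"
    return "full_speed"
```

The cobot re-evaluates this check continuously as its sensors update the human's position, slowing as the worker approaches and halting outright when they are within reach.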

The final common collaborative cobot operation resides in hand-guiding, which is reserved primarily for delicate production processes. In such operations, pressure sensing arrays enable cobots to learn from operators the proper orientation and speeds at which objects can be manipulated without causing damage.

Where Cobots Lose Sight

With such great market potential, collaborative robotics will no doubt be increasingly adopted by manufacturing operations of every size. Small and medium businesses will benefit substantially from the drop in supplier prices as the technology becomes more readily available, as well.


While all this is true, there is still plenty of room for improvement when it comes to cobots. In particular, machine vision will make substantial headway for cobots in manufacturing, where working alongside often spontaneous and unpredictable human colleagues poses the greatest opportunity for incidents to transpire. Additionally, humans may complete tasks in a number of different ways and place the tools they use in different locations for later use. What experts see as the zenith of collaborative robotics resides in the ability of cobots to one day anticipate their human colleagues' needs and provide the necessary tools or support to complete tasks in real time. Machine vision is seen as the primary component in making this idea a reality. Those days are still a long way off, but the developments taking place now bring the possibility nearer with each passing day.



The Chief Robotics Officer

With the industrial robot population on course to reach 1.7 million by 2020, enterprises the world over are reevaluating how they approach managing a mechanized workforce. To address modern manufacturing operations’ increasing reliance on automation and robotics, the C-suite is preparing to welcome a new designation among its ranks: Chief Robotics Officer (CRO).


Addressing An Increasingly Automated Workforce

While the concept may be a new one, a study conducted by Myria Research, a Massachusetts-based research and advisory services business, projects that 60% of Fortune 500 companies will have a Chief Robotics Officer on their executive teams. Beyond that, the Chief Robotics Officer Research Scenario predicts the Robotics & Intelligent Operational Systems (RIOS) technologies market will reach $1.2 trillion globally by 2025. The figure is tremendous when compared to the market's $63 billion valuation in 2015, and companies cannot afford to discount the increasing prevalence of RIOS in their own daily operations as well as those of their competitors. The projections include full hardware, software, and services segmentation, representing a 30% CAGR to 2020 and 40% from 2020 until 2025.
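Those growth figures hang together arithmetically: compounding the 2015 valuation at 30% per year through 2020, then 40% per year through 2025, lands near the $1.2 trillion projection.

```python
value = 63e9  # cited 2015 RIOS market valuation, in dollars
for _ in range(5):
    value *= 1.30  # 30% CAGR, 2015-2020
for _ in range(5):
    value *= 1.40  # 40% CAGR, 2020-2025
print(f"${value / 1e12:.2f} trillion")  # roughly $1.26 trillion
```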


The Chief Robotics Officer remains a conceptual position for the vast majority of enterprises. However, research suggests that as the prevalence of integrated robotics and intelligent systems increases, CROs will become equally important sources of vital information that lend insight into research, development, and innovation in the sector. Their duties can be likened to those of the Chief Information Officer during the influx of computers into businesses during the 1980s. Already there is an abundance of seminars professing the need for CROs if enterprises want to remain competitive. In short, a company's CRO will be responsible for implementing the latest robotics and automation technologies while increasing efficiency through the productivity promised by robotics and AI. A hands-on component is also likely for the role, where managing these robotic and digital workforces on the floor as well as in the cloud will pose unique challenges that only a human component can address.

How RIOS Affects The Labor Market

While it is already painfully obvious that automation, robotics, and intelligent operational systems are positioned to hamstring low-skilled labor, the pool of specialized staff required to maintain and interact with these systems is growing at a steady rate alongside implementation. This leaves a wealth of opportunity for those individuals being transitioned out of low-skill positions to take the reins and receive the training to assume the role of technological overseers.


Many companies are already supporting workforces keen on being promoted from within. The most prominent example comes from Amazon, which is offering sizable grants and educational opportunities in-house for any employee willing to beef up their background in robotics, automation, and IT.

Human-Robot Interaction In The Workplace And Beyond

Thanks to the stories spun in science fiction and technological horror films, humans have developed a healthy apprehension about integrating machinery too deeply into their lives. However, maintaining that buffer may prove infeasible, as the efficiencies these machines and processes provide are simply too valuable to ignore.


The results? A new classification of workplace interaction. Questions are already emerging about human-robot interaction in the workplace and beyond. Coworkers want to know how they'll split the workload, whether their vacation time and salary will be affected, or whether robots on their departmental teams will have a say in how projects are carried out. It may seem a silly distinction to make at this point in time, but as advances in artificial intelligence bring machines to a higher level of competency regarding cooperation and collaboration with their human counterparts, these human anxieties may become increasingly valid.

Robot Personhood And Emerging Legal Frameworks

At a higher level, governments are interested in how commerce is positioned to be improved and how administrations will have to adopt new legal frameworks to address things like liability and taxation. The EU is already considering granting personhood to robots. While corporate personhood in the US bestows certain rights and privileges on business entities, the proposed rules placed before EU legislators seek to address these issues of liability, safety, and changes in the labor market. A study on the notion of the robot, its implications, and the question of consciousness, authored by Nathalie Nevejans, Université d’Artois, Centre de Recherche en Droit Ethique et Procédures, can be found here. Rapporteur Mady Delvaux’s (S&D, LU) EU Commission report on liability, social impact, and request for legislation can be found here.



Space Robots

With the successful launch and reentry of SpaceX's Falcon Heavy rocket, now is an excellent opportunity to talk about space robots, machine vision, and their roles in expanding space research and exploration.

 


A Stunning View Of Earth Captured Following The Falcon Heavy Launch. Photograph: SpaceX.com Live Feed.

 

Space robots. Modeled after us in terms of morphology and size, they are superior to industrial robots when it comes to versatility and capability. While right now they may not look as advanced or operate as nimbly as their representations in sci-fi features from the Star Wars and Star Trek franchises, that gap is quickly shrinking. Taking on repairs and other tasks deemed too dangerous for astronauts, these specialized robots are the obvious candidates for many of the precarious activities taking place beyond the relative comfort of Earth.

Space Robots: R2 Goes To The International Space Station

The first humanoid robot in space, Robonaut 2, R2 for short, was developed by the Dexterous Robotics Laboratory at Johnson Space Center (JSC) in Houston, Texas. R2 emerged earlier this decade as the latest subject of robotics research in space. Originally consisting of only an upper torso and arms, R2 has now been equipped with two climbing manipulators, read as legs, capable of providing mobility in zero-g environments to complement dexterous arms and digits that handle intricate tasks. R2's evolution is a marvel for researchers and enthusiasts to behold, but what's more impressive than the achievements made over its predecessor, R1, are the advanced sensing capabilities that allow R2 to truly perform in one of the most challenging environments imaginable.

Robonaut 2 Working Tirelessly Aboard The International Space Station.

Machine Vision, Sensing, And Perception

The abilities to touch and see are perhaps the most extraordinary components of these robots’ capability. Vision and sensing components relay complex sets of data such as the identity, position, and orientation of objects in an image. Powerful industrial machine vision and process guidance systems are allowing next-generation robots the ability to evaluate and react effectively in real-time.

Without the component of machine vision, robots are little more than extensions of their controllers and the setpoints governing automated tasks. In R2’s case, 3D vision is the component of machine vision that allows it to perform complex tasks in a semi-autonomous fashion. R2 is both capable of remote control by operators and semi-autonomous operation using advanced software that lets R2 “think” of the solution to a given task. Software updates regularly expand the depth and breadth of R2’s capability. R2’s vision is governed by five cameras in all. Two to provide stereo vision for the robot and its operators, and two auxiliary cameras for backup use. The component of stereo vision allows images from two vantage points to be compared, effectively allowing R2 – and us – to see in 3D. A fifth infrared camera is contained within the mouth area to aid in depth perception. All vision components are housed within the cranium, while R2’s “brain” is located within the robot’s torso. R2 can look up and down, left and right, to fully gauge its surroundings.
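Stereo vision's depth recovery rests on one relation: the farther an object, the smaller the shift (disparity) between the two camera views. A minimal sketch of that relation, using made-up camera parameters rather than R2's actual specs:

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Pinhole stereo model: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("object must appear shifted between the two views")
    return focal_length_px * baseline_m / disparity_px

# A hypothetical rig: 800 px focal length, cameras 10 cm apart.
# An object shifted 20 px between the two views sits about 4 m away.
distance = depth_from_disparity(20, 800, 0.1)
```

Halving the distance doubles the disparity, which is why stereo depth is most precise for nearby objects, exactly where a robot's manipulators operate.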

 


 

A prime example of cooperative robotics at work, R2's ability to interact with the astronauts on the ISS mimics the way another person might. Operating at a pace relative to humans, R2 has a soft, padded skin equipped with sensing systems that allow it to react when encountering a person. Force control is provided by torsion springs inside the robot that allow R2 to react to influence from the environment. So, when a person pushes away an arm, R2 gives to the motion and lets the person by. This sensing capability also provides R2 with continuous awareness of its orientation and the location of its limbs relative to the environment and surrounding people.

Object Interaction

As for Robonaut's interaction with its environment, its hands work a bit differently than both humans' and industrial robots'. The key difference resides in R2's tendon-driven robotic fingers. Typically, robots will control their joints via tension controllers located on each tendon individually; putting it another way, joint torque translates into tendon tension. This poses a problem in the case of R2: the resulting disturbances between joint displacement and the tendon had to be addressed for R2 to be able to interact with unfamiliar objects in an array of positions. This is in stark contrast to R2's industrial cousins, which operate in uniform spaces with familiar objects. The solution to R2's dilemma came when NASA and GM engineers devised a joint-based torque control method that decouples the tendon. All this talk about torque is of particular importance for R2, as well as many other humanoid robots, due to the necessity for adaptable grip when interacting with a variety of objects large and small.
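The tendon-to-joint coupling can be pictured as a linear map: each tendon's tension contributes torque at every joint it crosses, weighted by a moment arm. The numbers below are invented for illustration; this is not NASA/GM's actual controller, just the geometric relationship the text describes.

```python
def joint_torques(tensions, moment_arms):
    """tau[j] = sum_i moment_arms[i][j] * tensions[i].

    tensions: one pull force per tendon (newtons, always >= 0).
    moment_arms: per-tendon row of moment arms (meters) per joint;
    the sign encodes which way that tendon rotates the joint.
    """
    n_joints = len(moment_arms[0])
    return [sum(moment_arms[i][j] * tensions[i] for i in range(len(tensions)))
            for j in range(n_joints)]

# Two antagonistic tendons on one finger joint: equal tensions cancel,
# so the joint holds still while the finger stays stiff against contact.
R = [[0.01], [-0.01]]  # opposite-signed moment arms, in meters
balanced = joint_torques([5.0, 5.0], R)   # zero net torque
flexing = joint_torques([8.0, 2.0], R)    # net torque flexes the joint
```

Because tensions can only pull, grip strength is set by co-contraction: raising both tensions together stiffens the finger without moving it, which is what makes the adaptable grip described above possible.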

Robonaut 2 Is Capable Of Working With An Array Of Tools. Photographer: Robert Markowitz.

What’s Next For The ISS And Non-Human Crewmembers

The most recent iteration of Robonaut coming from Houston's JSC is R5, or Valkyrie. Built to compete in the 2013 DARPA Robotics Challenge (DRC) Trials, the design of Valkyrie took place over a 15-month period and improved electronics, actuators, and sensing capabilities based on earlier generations of JSC humanoid robots. In particular, R5's vision and sensing system improvements are a tremendous advancement over those found in R2. Valkyrie's redesigned head sits on a neck possessing three degrees of freedom and features a Carnegie Robotics Multisense SL, a tri-modal (laser, 3D stereo, and video), high-resolution, high-data-rate, and high-accuracy 3D range sensor, as the main perceptual sensor. Additional modifications include infrared-structured light point cloud generation beyond the laser and passive stereo methods already implemented, as well as front and rear "hazard cameras" positioned in the torso.

The Latest Iteration Of Robonaut, Robonaut 5, Is Also Referred To As Valkyrie And Features The Latest Tech In Robotics For Space Applications.

As research advances technology here on the ground, components and software can be sent to the ISS for utilization. Once proven to operate effectively on the ISS, NASA and other robotics laboratories hope that innovative robotics and associated technologies can be applied further in the depths of space. In the future, thermal resistance for robots will likely be a main focal point for researchers.
