It is becoming an annual tradition: Roboception once again hosts a workshop during the European Robotics Forum.
About the Workshop
Perception is one of the key technologies for flexible production, including applications such as pick and place, machine tending, assembly, finishing, and quality testing. Applications in domains outside traditional manufacturing scenarios are in general more challenging, but the COVID pandemic has highlighted the need to use automation in environments where the health risks for humans are too high, e.g. test sampling and processing facilities.
The introduction of robotics in human-dominated environments demands a fast adaptability of the system. Recent developments in 3D object detection and pose estimation algorithms as well as machine learning opened up new ways in these challenging domains.
Therefore, combining machine learning with classical methods aims to provide reliability, robustness and flexibility at the same time to fulfill the requirements. Connecting these methods with innovative perception approaches shows great potential for coping with the requirements of lab automation, agile production and smart logistics.
In this workshop, use cases from industry are presented and then discussed in an interactive session with the attendees. The main goal is to create synergies and potential collaborations between researchers and industry, to facilitate the introduction of recent perception technologies into new scenarios.
Is the implementation of smart automation solutions at the top of your agenda this year? We may be able to help:
Reliable vision solutions are the key to increasing flexibility and efficiency. Bin picking, machine tending and the quick adaptation to new tasks or workpieces: Roboception offers complete hardware and software systems that have been tailored to robotic applications.
Learn more in our webinar
In our 1-hour webinar, let us introduce our innovative portfolio, from the rc_visard stereo sensor to the latest addition to our rc_reason software suite: The CADMatch Module, which localizes objects independent of their position and orientation, and reliably delivers grasp points while avoiding collisions with the load carrier or other objects.
Not a vision expert? Not a problem! Our user interface is easy and intuitive to handle: Even without any prior experience in robot vision, you’ll be able to deliver e.g. grasp points to your robot in no time!
Experience a live view of our hardware and software, and get your questions answered straight away.
These are the next dates:
15 April, 11:00-12:00 CET (German) | Sign up…
20 April, 16:00-17:00 CET (English) | Sign up…
18 May, 11:00-12:00 CET (German) | Sign up…
19 May, 17:00-18:00 CET (English) | Sign up…
These webinars are conducted via Zoom. They are, of course, free of charge; however, registration is required.
The dates don’t work for you, you’re in a rush, prefer another platform or would like to discuss concrete use cases? Let’s set up an individual slot: Contact us or – easier yet – book your personal demo slot here…
With its innovative hardware and software products, Roboception is a pioneer in 3D sensor technology: We give the robots eyes – and hence deliver key elements of our customers’ most forward-looking automation solutions.
We are currently strengthening our teams with
You are interested in helping to develop our company, and to grow with us? Please send your complete application documents including your salary expectations to email@example.com.
The rc_reason CADMatch Module is now ready to order in our webshop.
Enable your robotic system to reliably detect, localize and pick objects from unmixed load carriers, fully independent of the object’s position and orientation: Simply feed a CAD-based template into the CADMatch Module, define one (or several) grasp point(s) on your object with just a few mouse clicks, and easily deliver task-relevant information for your robotic application.
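As a rough illustration of the idea (not Roboception’s actual API – all names below are hypothetical), the core of this workflow is that a grasp point is defined once on the CAD template, and every detection then transforms it by the estimated object pose into scene coordinates:

```python
import math

def rot_z(angle):
    """3x3 rotation matrix about the z axis (angle in radians)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def transform(pose, point):
    """Apply a rigid pose (R, t) to a 3D point: p' = R @ p + t."""
    R, t = pose
    return tuple(sum(R[i][j] * point[j] for j in range(3)) + t[i]
                 for i in range(3))

# Grasp point defined once on the CAD template (object frame, metres).
grasp_on_template = (0.05, 0.0, 0.02)

# Estimated object pose in the scene: rotated 90 deg about z, then translated.
estimated_pose = (rot_z(math.pi / 2), (0.40, -0.10, 0.30))

# The deliverable: the grasp point expressed in scene coordinates,
# valid regardless of how the object lies in the load carrier.
grasp_in_scene = transform(estimated_pose, grasp_on_template)
print(grasp_in_scene)  # approx (0.40, -0.05, 0.32)
```

In a real system this pose would come from the detection module and the result would still be checked for collisions with the load carrier; the sketch only shows why one template-side grasp definition suffices for arbitrary object orientations.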
A machine learning approach provides ease of use, while classical refinement ensures accuracy – this combination of ‘the best of both worlds’ is AppliedAI at its best!
Would you like to see for yourself?
Get in touch to schedule a demo and/or send us a CAD model of your object. Within a day or so, we will provide a report on the expected accuracy of the detection. That’s free of charge, of course!
Our perception solutions have been highlighted by the European Commission’s Innovation Radar as an excellent, market-ready innovation that has benefited from EU Funding – in this case from the Horizon 2020 project THOMAS, in which we developed some components of the rc_reason software suite.
The Innovation Radar platform builds on the information and data gathered by independent experts involved in reviewing ongoing research and innovation projects funded by the European Commission. These experts also provided an independent view regarding the innovations in the projects and their market potential.
Well, what can we say? Thank you for the funding, of course, as well as for highlighting us on the Innovation Radar!
For more on Project Thomas, click here…
The countdown is running: Roboception will release the rc_reason CADMatch Module later this year. And while the company’s perception experts are still completing final touches on this software, an Early Access Program is about to start – with a few slots still available.
The CADMatch software module will enable the rc_visard to reliably detect and localize unmixed objects, e.g. in bins or cages, fully independent of their position/orientation, based on a previously taught CAD model. The software will also allow the user to specify one (or even several) grasp points per object, hence enabling picking by a two-finger gripper or a suction device.
Users, in particular in the automation domain, are eagerly awaiting this component, as the functionality is key to increased efficiency in many automation processes – with machine tending being a prominent (but far from the only) example.
Following the keen interest of numerous customers, Roboception has now launched the Early Access Program, providing users access to a pre-release version of the new solution.
“This obviously gives the users a head start on their tests and evaluations, as well as the preparation of the actual implementation”, says Elena Gambaro, Roboception’s Product Owner for the rc_reason Software Suite. “At the same time, our teams will be able to integrate highly valuable user feedback as we optimize the software over the coming months – a clear win-win for everyone.”
The Early Access Program can still accommodate a few additional users who are interested in starting to evaluate the software over the summer. Interested parties are invited to contact firstname.lastname@example.org with a brief description of their use case/objects.
This sensor variant features lenses with a focal length of 6 mm. This leads to a slightly reduced field of view (and hence a smaller workspace); on the other hand, it delivers higher resolution and accuracy, which is especially useful for bin-picking applications with rather large viewing distances. Or, as we like to put it: With the 6-mm variant, you can ‘put more pixels in the bin’.
At a distance of 3 m, a depth resolution of 2.2 mm is achieved; at 2 m distance, it even reaches 1.0 mm. The field of view of 43° (horizontal) and 33° (vertical) results in a workspace of 2.24 m x 1.80 m (at 3 m) and 1.44 m x 1.20 m (at 2 m).
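Two quick sanity checks on these numbers, assuming a simple pinhole/stereo model (so only approximate): the workspace extent roughly follows w ≈ 2·d·tan(FOV/2), and the quoted depth resolutions are consistent with the usual quadratic growth of stereo depth error with distance:

```python
import math

def workspace(fov_deg, dist_m):
    """Approximate extent covered at a given distance for a given field of view."""
    return 2.0 * dist_m * math.tan(math.radians(fov_deg / 2.0))

# 43 deg horizontal x 33 deg vertical at 3 m. Slightly larger than the
# quoted 2.24 m x 1.80 m, since the stereo overlap trims the usable area.
print(round(workspace(43, 3.0), 2), round(workspace(33, 3.0), 2))  # 2.36 1.78

# Stereo depth error grows roughly quadratically with distance: dz ~ z^2.
# Scaling the 2 m figure (1.0 mm) to 3 m predicts 1.0 * (3/2)^2 = 2.25 mm,
# close to the quoted 2.2 mm.
print(round(1.0 * (3.0 / 2.0) ** 2, 2))  # 2.25
```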
This sensor variant was first implemented as a dedicated solution for a specific logistics application, but it is proving to be increasingly popular with customers, especially in the logistics domain – and has hence been added to the off-the-shelf portfolio.
The German trade magazine MaschinenMarkt has now published a case study titled ‘Selbstlernendem Roboter gelingt das Bin Picking’ (Self-Learning Robot Manages Bin Picking). It presents the innovative ‘Pick Center Rovolution’ of our customer TGW Logistics Group, which relies on two rc_visard 160 as its vision component.
Good news for all users of Universal Robots: Highly efficient pick-and-place applications, machine tending, bin picking or (de)palletizing have just become easier for your UR: Roboception’s rc_visard 3D stereo sensor and innovative rc_reason software modules are now fully compatible with Universal Robots through a URCap.
Getting started is as easy as can be:
rc_visard users who switch to a UR robot can simply download the URCap (free of charge) from Roboception’s website.
New rc_visard users can purchase their hardware and software solution as part of our Universal Robots Pick Package that includes a ‘starter kit’: It contains the URCap, a tailored calibration grid and a set of cables and connectors plus the selected sensor and a software component (if any).
Until September 2020, the sensor is part of the exhibition ‘Human-Nature’, which explores the value of good design in a world where humans achieve enhanced abilities through technology, algorithms and machine learning, and where co-existing with robots is a reality.
If you happen to be in town, do stop by for a visit – and/or read more about this exciting exhibition here…
Picture: Red Dot Design Museum