What happens when two technology leaders sit down to discuss the future of sensing and perception? In this fireside chat from Aeva Day, Aeva CTO Mina Rezk and LG Innotek CTO Dr. S. David Roh share their vision for how FMCW technology and advanced sensor integration will power the next era of Physical AI, including smart mobility, robotics, and automation. Watch the full conversation here: https://bit.ly/4h5Cr4U
Train Ultralytics YOLO on rare and fine-grained objects with the LVIS Dataset!
LVIS (Large Vocabulary Instance Segmentation) offers 2M+ segmentation masks across 1,200+ categories, capturing real-world object frequency, from common to rare.
It’s a powerful resource for:
✅ Long-tail object detection and segmentation
✅ Open-world vision and generalization
✅ Applications in robotics, retail, and autonomous systems
✅ Research in rare-object handling and instance-level understanding
Supported natively in the Ultralytics ecosystem, LVIS helps you push model generalization to the next level.
Explore the dataset ➡️ https://ow.ly/lzC150X01UH
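As a quick-start sketch, assuming the Ultralytics Python package and its bundled lvis.yaml dataset config are available in your environment (the model name and hyperparameters below are illustrative, not prescribed settings):
```python
# Minimal fine-tuning sketch on LVIS with Ultralytics YOLO.
from ultralytics import YOLO

# Start from a pretrained segmentation checkpoint (illustrative choice).
model = YOLO("yolo11n-seg.pt")

# Fine-tune on the LVIS dataset config shipped with Ultralytics.
model.train(data="lvis.yaml", epochs=100, imgsz=640)

# Validate to inspect mask mAP across the long-tail category distribution.
metrics = model.val()
print(metrics.seg.map)  # mean AP for segmentation masks
```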
Discover Neuron, a cloud-based solution that leverages NVIDIA Omniverse and Gen AI to build digital factory simulations, along with advanced automation and robotics capabilities. Together, we’re driving innovation in manufacturing, improving physical systems, optimizing processes, predicting issues and ultimately cutting costs and boosting productivity. Learn More! https://accntu.re/4kFg7jY
DID YOU KNOW?
TriEye is redefining short-wave infrared (SWIR) imaging by leveraging CMOS-based sensor technology to overcome the cost and scalability limitations of conventional InGaAs sensors. Its proprietary platform integrates SWIR detection capabilities directly into standard silicon processes—enabling compact, low-cost, and mass-producible imaging solutions. This innovation opens the door for advanced sensing across multiple domains, including automotive ADAS and autonomous systems, precision industrial inspection, biometric authentication, and high-performance mobile imaging. Founded in 2017, TriEye has secured $96 million in funding to accelerate commercialization of its CMOS-SWIR technology and extend the reach of infrared vision into mainstream applications. More at www.trieye.tech.
Wrapping up our CeMAT 2025 highlights with Hall N1 — the core stage for smart mobile robotics.
If Hall W3 was about the mechanical backbone of automation, N1 was the brain 🧠— where AGVs and AMRs evolve into intelligent, connected, and adaptive systems.
Across the floor, we saw the industry’s sharpest focus on navigation algorithms, sensor fusion, scheduling software, and multi-robot coordination. Vision-based SLAM, semantic mapping, and composite robots (AMR + manipulator) have moved from concept to real-time demos — signalling that mobility is becoming more perceptive, collaborative, and context-aware.
What stood out most was the pace of innovation 🚀. “New release” tags were everywhere — from upgraded perception modules to lightweight modular platforms and cloud-edge scheduling. It’s clear that China’s AMR ecosystem is shifting from single-vehicle performance to system-level intelligence.
Watching these robots plan, adapt, and collaborate in real time felt less like automation — and more like witnessing a distributed intelligent system taking shape, one that’s redefining how factories think, move, and coordinate.
#CeMAT2025 #HallN1 #AMR #AGV #SmartLogistics #Automation #AI #VisionNavigation #Intralogistics #SystemIntegration #ATOMBOTIX
For robots to become true partners in our daily lives, vision alone is not enough. By bringing the sense of touch to machines, we open the door to more natural, adaptive, and human-like interaction.
With advances in tactile sensing, we are building a future where robots can grasp, feel, and respond with the same dexterity we rely on every day. Learn more: https://bit.ly/4mTOVhv
Part of Analog Devices' physical intelligence agenda is the pursuit of touch via the Emergent AI team under Massimiliano Versace and Tao Yu, some of the brilliant minds marrying AI software innovation with ADI's deep sensor and analog heritage.
Physical intelligence is something ADI already possesses, built on some of the most exacting engineering know-how in the silicon industry. Bringing it to foundation models, to align them with the stubborn realities of physical-world problems, is the natural next step. Reflexive control of humanoid hands via touch is just one example.
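For intuition, here is a purely illustrative sketch of what "reflexive control via touch" can mean in practice. The sensor readout, thresholds, and gripper interface are hypothetical placeholders, not ADI's implementation:
```python
# Hypothetical reflex loop: tighten the grip when tactile slip is detected.
# Sensor/actuator functions are placeholders, not a real vendor API.
import time

SLIP_THRESHOLD = 0.15   # normalized shear-rate above which we assume slip
FORCE_STEP = 0.05       # increment of normalized grip force per reflex tick
MAX_FORCE = 1.0

def read_tactile_shear() -> float:
    """Placeholder: return a normalized shear-rate estimate from a tactile array."""
    raise NotImplementedError

def set_grip_force(force: float) -> None:
    """Placeholder: command the hand/gripper to a normalized grip force."""
    raise NotImplementedError

def reflex_loop(initial_force: float = 0.2, hz: float = 200.0) -> None:
    """Fast local loop that reacts to slip without waiting on a high-level planner."""
    force = initial_force
    while True:
        if read_tactile_shear() > SLIP_THRESHOLD:
            force = min(force + FORCE_STEP, MAX_FORCE)
            set_grip_force(force)
        time.sleep(1.0 / hz)
```
The point of the sketch is the control structure: touch feedback closes a fast, local loop beneath whatever foundation model or planner sits above it.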
Manufacturing is moving beyond automation toward true autonomy.
At Amillan, we explore how Agentic AI, robotics, and intelligent networks are reshaping factories into adaptive, connected ecosystems.
Our latest thought piece reveals how Managed Service Providers, like Amillan, can bridge IT and OT, enable AI-native networks, and drive smarter operations across every level of manufacturing.
📩 Contact Amillan to explore how we can help your business build intelligent, future-ready infrastructure.
👉 Read more about the next frontier of intelligent infrastructure below
🤖 Bimanual assembly requires more than just vision: it needs touch.
VT-Refine introduces a real-to-sim-to-real framework that combines simulation, vision, and touch to train robots for precise manipulation with minimal real-world data.
📗 https://lnkd.in/gumSZc56 #NVIDIAResearch
A perfect example of how tactile sensor data and vision-tactile sensing fusion can significantly enhance robotic manipulation 👍. TouchTronix Robotics provides both standard and customized "Plug-and-Play" tactile array sensor modules for 2/3-finger grippers and 5-finger hands, with high performance and reliability: https://lnkd.in/eT8GMWQG
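For intuition, here is a minimal late-fusion sketch in PyTorch that combines a visual embedding with a tactile-array embedding before a small action head. The dimensions and architecture are illustrative assumptions, not the VT-Refine or TouchTronix design:
```python
# Illustrative late-fusion module: concatenate vision and tactile features.
# Shapes and layer sizes are assumptions for demonstration only.
import torch
import torch.nn as nn

class VisionTactileFusion(nn.Module):
    def __init__(self, img_dim=512, tactile_channels=1, action_dim=7):
        super().__init__()
        # Tactile branch: treat the tactile array as a small 1-channel image.
        self.tactile_encoder = nn.Sequential(
            nn.Conv2d(tactile_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),                # -> 16 * 4 * 4 = 256 features
        )
        # Fusion head: concatenate visual and tactile embeddings.
        self.head = nn.Sequential(
            nn.Linear(img_dim + 256, 256),
            nn.ReLU(),
            nn.Linear(256, action_dim),  # e.g. an end-effector action or refinement
        )

    def forward(self, img_feat, tactile_map):
        # img_feat: (B, img_dim) features from any image backbone
        # tactile_map: (B, 1, H, W) pressure readings from a tactile array
        tactile_feat = self.tactile_encoder(tactile_map)
        return self.head(torch.cat([img_feat, tactile_feat], dim=-1))

# Quick shape check with random tensors.
model = VisionTactileFusion()
out = model(torch.randn(2, 512), torch.randn(2, 1, 8, 8))
print(out.shape)  # torch.Size([2, 7])
```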
🛫 Last week, our colleagues travelled to #Tolbert (The Netherlands) to take part in the integration workshop of AIMEN’s developments in the Menicon demonstrator, within the framework of the SMARTHANDLE EU Project.
This integration took place at the facilities of project partner STT Products BV, together with Demcon and #Menicon. #AIMENCT contributed to key developments:
👁️ a visual sensor for detecting contact lenses and reading a code engraved on their surface, and
✋ a tactile sensor enabling robotic manipulation of the lenses.
#SMARTHANDLE, a Horizon Europe RIA project, aims to develop intelligent and reconfigurable gripping systems to make manufacturing lines more flexible, autonomous and efficient — supporting Europe’s transition towards smarter, safer and more sustainable production.
#AIMENResearch #HorizonEurope #SmartManufacturing #Robotics #Innovation #Automation #Industry40 #AI #Sustainability
Looking sharp, Mina Rezk!