
Nov - Week 3 - Robotix Pulse


1- Here are this week's robotics engineering jobs. [The list has been moved to the page that includes all benefits for paid members.]


Udacity offers a fully online Master's in AI; check it out here:

Use code "SINA40" to get 40% off



2- This Week's LinkedIn posts:

๐—ฉ๐—ฃ (Vice President) Career Path Study: ๐—ฃ๐—ฎ๐—ฟ๐˜ ๐Ÿฎ: What Did Tech VPs Study & what School did they graduate from? Last week in Post #1, we kicked off a series where we analyze ~100 LinkedInโ€ฆ | Sina Pourghodrat (PhD)
๐—ฉ๐—ฃ (Vice President) Career Path Study: ๐—ฃ๐—ฎ๐—ฟ๐˜ ๐Ÿฎ: What Did Tech VPs Study & what School did they graduate from? Last week in Post #1, we kicked off a series where we analyze ~100 LinkedIn profiles of VPs from top tech companies: ๐˜ˆ๐˜ฑ๐˜ฑ๐˜ญ๐˜ฆ, ๐˜•๐˜๐˜๐˜‹๐˜๐˜ˆ, ๐˜Ž๐˜ฐ๐˜ฐ๐˜จ๐˜ญ๐˜ฆ, ๐˜ˆ๐˜ฎ๐˜ข๐˜ป๐˜ฐ๐˜ฏ, ๐˜”๐˜ช๐˜ค๐˜ณ๐˜ฐ๐˜ด๐˜ฐ๐˜ง๐˜ต, ๐˜”๐˜ฆ๐˜ต๐˜ข, ๐˜›๐˜ฆ๐˜ด๐˜ญ๐˜ข, ๐˜๐˜ฏ๐˜ต๐˜ฆ๐˜ญ, ๐˜ข๐˜ฏ๐˜ฅ ๐˜“๐˜ช๐˜ฏ๐˜ฌ๐˜ฆ๐˜ฅ๐˜๐˜ฏ. Our ๐—ด๐—ผ๐—ฎ๐—น is simple: Understand who these leaders are, their background, education, and the paths they took. In Part 1 [https://lnkd.in/ePBjhaCT], we shared two surprising insights: - 86.3% of VPs do not have an MBA - Their highest degrees lean heavily toward a graduate degree (masterโ€™s & PhD) ๐ŸŽ“ ๐—ฃ๐—ฎ๐—ฟ๐˜ ๐Ÿฎ: ๐—ช๐—ต๐—ฎ๐˜ ๐——๐—ถ๐—ฑ ๐—ง๐—ต๐—ฒ๐˜€๐—ฒ ๐—ฉ๐—ฃ๐˜€ ๐—ฆ๐˜๐˜‚๐—ฑ๐˜†? Based on our dataset, here are the top majors for tech VPs (grouped & normalized): - CS/CE (Computer Science / Computer Engineering) โ€“ largest share by far - EE (Electrical Engineering) - ME (Mechanical Engineering) ๐Ÿซ ๐—ช๐—ต๐—ฒ๐—ฟ๐—ฒ ๐——๐—ถ๐—ฑ ๐—ง๐—ต๐—ฒ๐˜† ๐—ฆ๐˜๐˜‚๐—ฑ๐˜†? While Stanford and UC Berkeley are top U.S. universities they got their highest degree from, there is ๐—ป๐—ผ single โ€œmust-attendโ€ university. Yes, top schools show up, but the majority come from a wide range of universities, including many you wouldnโ€™t expect. ๐Ÿ“Œ From an ๐—ฒ๐—ฑ๐˜‚๐—ฐ๐—ฎ๐˜๐—ถ๐—ผ๐—ป perspective, our dataset suggests that MBAs are relatively uncommon among these VPs, and school names vary widely. The most frequent pattern we see is VPs holding graduate-level technical degrees, particularly in CS/CE, though this may not represent the entire industry ---> Follow along: more insights coming in this series. 
At the end of this series, weโ€™ll release the full dataset ------------------------------------------------------------------------------------- For readers interested in upskilling in AI, Udacity currently has a 55% discount on its online ๐— ๐—ฎ๐˜€๐˜๐—ฒ๐—ฟโ€™๐˜€ ๐—ถ๐—ป ๐—”๐—œ using code BLACKFRIDAY55 [sponsored & affiliate link]: https://lnkd.in/eTMFKN72 ----------------------------------------------------------------------------------------
UC San Diego Researchers Teach Robots with Human Videos | Sina Pourghodrat (PhD) posted on the topic | LinkedIn
๐—œ๐—ป-๐—ก-๐—ข๐—ป: Teaching #robots from real first-person human videos Robots usually learn from small, clean datasets collected in labs. But real life is messy. Humans have thousands of hours of first-person videos showing how we grab objects, cook, clean, open drawers, and do everyday tasks. ๐—ง๐—ต๐—ฒ ๐—ฝ๐—ฟ๐—ผ๐—ฏ๐—น๐—ฒ๐—บ? This data is heterogeneous (all different and inconsistent), different cameras, different tasks, different environments. Most robot systems donโ€™t know how to learn from it properly. ๐—ฆ๐—ผ๐—น๐˜‚๐˜๐—ถ๐—ผ๐—ป? Researchers at UC San Diego provide a recipe for collecting and using egocentric data (first-person human video showing what the person sees and does). They combine two kinds of human data: In-the-wild videos โ€” everyday first-person footage On-task videos โ€” videos that match robot manipulation tasks These form a new dataset called PHSD. They then train a model called Human0, which learns from video, hand motion, and language instructions. ๐—›๐—ผ๐˜„ ๐—ถ๐˜ ๐˜„๐—ผ๐—ฟ๐—ธ๐˜€ (super simplified): The robot โ€œwatchesโ€ first-person human videos. It learns how hands move and how objects respond. With a small amount of robot data, it can perform many tasks by itself. ๐—ช๐—ต๐˜† ๐˜๐—ต๐—ถ๐˜€ ๐—บ๐—ฎ๐˜๐˜๐—ฒ๐—ฟ๐˜€? Robots can learn more skills without needing tons of robot demonstrations. They understand human actions better. They become more general and robust ๐—ž๐—ฒ๐˜† ๐—ฟ๐—ฒ๐˜€๐˜‚๐—น๐˜๐˜€ Human0 can follow language instructions even when trained mostly on human video. It learns new tasks with very little robot data (few-shot). Mixing in-the-wild + on-task data makes the robot more reliable. This is a step toward robots that learn the way humans do, by watching real people in real situations. 
๐Ÿ“– Paper: https://lnkd.in/erJqxvyh ๐ŸŒProject Site: https://lnkd.in/e5Nh8SHU ๐Ÿ‘ฅ Authors: Xiongyi Cai, Ri-Zhao Qiu, Geng Chen, Lai Wei, Isabella Liu, Tianshu Huang, Xuxin Cheng, Xiaolong Wang --------------------------------------------------------------------------------- Udacityโ€™s Fully online ๐— ๐—ฎ๐˜€๐˜๐—ฒ๐—ฟโ€™๐˜€ ๐—ถ๐—ป ๐—”๐—œ*: project-based, and built for real industry skills*--> https://lnkd.in/eTMFKN72 55% OFF -> Code โ€œBLACKFRIDAY55โ€ ---------------------------------------------------------------------------------- Thank you Ri-Zhao Qiu for the permission to use the video.
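The data-mixing idea above can be sketched in a few lines. This is a hypothetical illustration of sampling training batches from three heterogeneous sources (in-the-wild video clips, on-task video clips, and a small pool of robot demonstrations); the function name, sampling weights, and dataset sizes are assumptions for illustration, not the actual PHSD/Human0 recipe.

```python
import random

def sample_batch(in_the_wild, on_task, robot_demos,
                 weights=(0.5, 0.4, 0.1), batch_size=8, seed=0):
    """Draw one mixed training batch from three heterogeneous data sources.

    Each element of the batch is drawn from one source, chosen with the
    given probabilities, so plentiful human video dominates while the
    scarce robot demonstrations still appear occasionally.
    """
    rng = random.Random(seed)
    sources = [in_the_wild, on_task, robot_demos]
    batch = []
    for _ in range(batch_size):
        src = rng.choices(sources, weights=weights)[0]  # pick a source
        batch.append(rng.choice(src))                   # pick a sample from it
    return batch

# Toy stand-ins for the three datasets (sizes are made up).
wild = [f"wild_{i}" for i in range(100)]
task = [f"task_{i}" for i in range(50)]
robot = [f"robot_{i}" for i in range(5)]

batch = sample_batch(wild, task, robot)
```

The point of the sketch is the imbalance: most of each batch comes from cheap human video, and only a small fraction from expensive robot data, which is how few-shot adaptation becomes viable.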
LocoTouch: teaching quadruped robots to feel and carry objects safely | Sina Pourghodrat (PhD) posted on the topic | LinkedIn
๐™‡๐™ค๐™˜๐™ค๐™๐™ค๐™ช๐™˜๐™: teaching quadruped robots to โ€œfeelโ€ while carrying objects Quadruped robots are getting better at walking and climbing stairs. But thereโ€™s one big challenge we donโ€™t talk about enough: ๐˜”๐˜ฐ๐˜ด๐˜ต ๐˜ญ๐˜ฆ๐˜จ๐˜จ๐˜ฆ๐˜ฅ ๐˜ณ๐˜ฐ๐˜ฃ๐˜ฐ๐˜ต๐˜ด ๐˜ค๐˜ข๐˜ฏโ€™๐˜ต ๐˜ง๐˜ฆ๐˜ฆ๐˜ญ ๐˜ธ๐˜ฉ๐˜ข๐˜ต ๐˜ต๐˜ฉ๐˜ฆ๐˜บโ€™๐˜ณ๐˜ฆ ๐˜ค๐˜ข๐˜ณ๐˜ณ๐˜บ๐˜ช๐˜ฏ๐˜จ. If you put a box, or a fragile object it has no idea how the load is shifting. This makes transport tasks unsafe and unstable. LocoTouch developed by researchers at Carnegie Mellon University, University of Washington, and Google DeepMind tries to solve this problem. ๐—ง๐—ต๐—ฒ ๐—ฐ๐—ผ๐—ฟ๐—ฒ ๐—ถ๐—ฑ๐—ฒ๐—ฎ (simplified) LocoTouch adds tactile sensing to a quadruped robot so it can feel the forces from the object itโ€™s carrying. And then they train a reinforcement learning (RL) policy that learns how to move dynamically while reacting to those forces. The combination is powerful: The robot senses pressure and force changes on its back RL learns how to adjust its gait, balance, and speed The robot can safely move even when the load shifts suddenly Basically: LocoTouch = โ€œtouch + RL locomotionโ€ โ†’ safe dynamic transport. With LocoTouch, the robot can transfer zero-shot (performing new tasks without specific training) from simulation to the real world and safely transport many everyday objects with good robustness and agility. 
------------------------------------------------------------------------------- ๐Ÿ“ฐPaper: https://lnkd.in/gyS4nNJT ๐ŸŒProject Page: https://lnkd.in/gzze498A ๐Ÿง‘โ€๐Ÿ’ปCode: https://lnkd.in/gsSG54Nx ๐Ÿ‘ฅ Authors: Changyi Lin, Yuxin Ray Song, Boda Huo, Mingyang Yu, Yikai Wang, Shiqi Liu, Yuxiang Yang, Wenhao Yu, Tingnan Zhang, Jie Tan, Yiyue Luo, Ding Zhao ----------------------------------------------------------------------------------- Advance your career with an accredited ๐— ๐—ฎ๐˜€๐˜๐—ฒ๐—ฟโ€™๐˜€ ๐—ถ๐—ป ๐—”๐—œ*: flexible, project-based, and built for real industry skills*: https://lnkd.in/eTMFKN72 40% off ๐˜ธ๐˜ช๐˜ต๐˜ฉ ๐˜ค๐˜ฐ๐˜ฅ๐˜ฆ โ€œSINA40โ€ *Thank you Udacity for sponsoring this post *This post contains an optional affiliate link that supports my work at no extra cost to you. -----------------------------------------------------------------------------------
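To make "touch + RL locomotion" concrete, here is a minimal sketch of the kind of observation vector a tactile-aware locomotion policy might consume: proprioception (joint positions, joint velocities, base angular velocity) concatenated with a flattened, normalized pressure grid from the robot's back. All shapes, names, and the normalization scheme are assumptions for illustration, not LocoTouch's actual interface.

```python
import numpy as np

def build_observation(joint_pos, joint_vel, base_ang_vel, tactile_grid):
    """Concatenate proprioception with a flattened, normalized tactile grid.

    The tactile grid is scaled to [0, 1] so the policy sees relative
    pressure distribution (where the load sits and how it shifts),
    alongside the usual locomotion state.
    """
    tactile = np.asarray(tactile_grid, dtype=np.float32).ravel()
    tactile = tactile / (tactile.max() + 1e-6)  # avoid division by zero
    return np.concatenate([
        np.asarray(joint_pos, dtype=np.float32),
        np.asarray(joint_vel, dtype=np.float32),
        np.asarray(base_ang_vel, dtype=np.float32),
        tactile,
    ])

# Toy example: 12 joints (typical for a quadruped) and an 8x8 pressure grid.
obs = build_observation(
    joint_pos=np.zeros(12),
    joint_vel=np.zeros(12),
    base_ang_vel=np.zeros(3),
    tactile_grid=np.random.rand(8, 8),
)
```

An RL policy trained on observations like this can, in principle, correlate pressure shifts with impending load slip and slow down or rebalance before the object falls, which is the behavior the post describes.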

List of Benefits for Paid Members: https://www.robotixwithsina.com/benefits-for-paid-members/

Access your benefit if you are a paid member: https://www.robotixwithsina.com/resources-for-paid-members/


Share your total compensation details to help build a transparent resource for robotics professionals*+

Total Compensation - Robotics
Share your total compensation details (base salary, bonuses, RSUs) to help build a transparent resource for robotics professionals. Submissions are anonymous and will help others understand industry standards. By submitting, you confirm that your information doesn't violate any confidentiality agreements or policies. Contributors get free access to the final database. Thank you for your support! (You can email me separately after you submit this form: robotixwithsina@gmail.com)

*Submissions can be anonymous. By submitting, you confirm that your information doesn't violate any confidentiality agreements or policies.

+Contributors get free access to the final database.

โš ๏ธ This Newsletter issue is sponsored by Udacity. This page includes an affiliate link to support my work at no extra cost to you, but youโ€™re never obligated to use it.