Category Archives: Uncategorized

Lizzo – Juice (Official Video)

New single “JUICE” out now! http://bit.ly/2VoIDuN

WATCH, SIP, REPEAT 🥤💦

Subscribe for more content from Lizzo:
http://bit.ly/2Qpl4Oy

Director: Quinn Wilson
Video Commissioner: Emmanuelle Cuny
Executive Producer: Alli Maxwell
Associate Director, Video Administration: Lily F Thrall
Assistant, Video Production: Trevor Joseph Newton

Follow Lizzo
http://bit.ly/2Vw9C7v

http://bit.ly/2QofZGu
http://bit.ly/1rCtSxo

http://lizzomusic.com

The official YouTube channel of Atlantic Records artist Lizzo. Subscribe for the latest music videos, performances, and more.

Born in Houston and raised in Detroit, Lizzo adopted her moniker in 2011 and fronted Lizzo & the Larva Ink after moving to Minneapolis. Her music began to intertwine with the city’s indie scene, leading to collaborations with artists like Gayngs and Doomtree.

Since then, Lizzo has collaborated with a variety of creatives, including Clean Bandit, Bastille, and Big Freedia, and was named to Forbes Magazine’s 2018 “30 Under 30” list. Her top singles “Good As Hell” and “Truth Hurts” have gained over 34.5 million Spotify streams combined. In addition to headlining her own Good As Hell tour in 2017, Lizzo joined Haim on the Sister Sister Sister tour in 2018.

via YouTube

Mayor Mike Duggan Keynote Address | #MPC17

Detroit is truly the comeback city. From booming development and a robust entrepreneurial ecosystem to efforts to reduce crime and blight, the city’s revitalization is a shining example for metropolitan communities across the country. But there is still more work to do. As his first term nears completion, Detroit Mayor Mike Duggan will reflect on the progress being made in the city and ongoing collaboration needed to ensure all of its citizens benefit from Detroit’s rebirth.

Interviewer: Paul W. Smith, Host, WJR NewsTalk 760 AM

via YouTube

Turning 2D into depth images

Most cameras record only colour, but new software developed by UCL computer scientists can now accurately estimate the 3D shapes of objects captured through a single lens. The method, published at CVPR 2017, gives state-of-the-art results and works with existing photos, allowing any camera to map the depth of every pixel it captures. The technology has a wide variety of applications, from augmented reality in computer games and apps, to robot interaction, and self-driving cars. Historical images and videos can also be analysed by the software, which is useful for reconstructing incidents or automatically converting 2D films into immersive 3D.

Inferring object range from a single image in real time has a whole host of potential uses. Depth mapping is critical for self-driving cars to avoid collisions, for example. Currently, car manufacturers use a combination of laser scanners and radar sensors, which have limitations. Their cars carry cameras too, but an individual camera on its own cannot provide meaningful depth information.

The new software was developed using machine learning methods and has been trained and tested in outdoor and urban environments. It successfully estimates depths for thin structures such as street signs and poles, as well as people and cars, and quickly predicts a dense depth map for each 512 x 256 pixel image, running at over 25 frames per second.

Currently, depth mapping systems rely on bulky binocular stereo rigs, or on a single camera paired with a laser or light-pattern projector; projector-based systems don’t work well outdoors, because objects move too fast and sunlight dwarfs the projected patterns.
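The binocular rigs mentioned above recover depth from the horizontal disparity between the two views via the standard stereo relation depth = focal length × baseline / disparity, which is also why a single lens cannot triangulate depth directly. A minimal sketch of that conversion (the camera parameters here are illustrative values, not taken from the UCL system):

```python
import numpy as np

# Stereo geometry: depth = focal_length * baseline / disparity.
# Illustrative parameters only, not from the research described above.
focal_px = 720.0     # focal length, in pixels
baseline_m = 0.54    # separation between the two cameras, in metres

# Predicted disparity map (in pixels) for a 4x4 image patch:
# large disparity = near object, small disparity = far object.
disparity = np.array([
    [40.0, 40.0, 10.0, 10.0],
    [40.0, 40.0, 10.0, 10.0],
    [80.0, 80.0,  5.0,  5.0],
    [80.0, 80.0,  5.0,  5.0],
])

depth_m = focal_px * baseline_m / disparity
print(depth_m[0, 0])  # ~9.7 m  (disparity 40 px)
print(depth_m[2, 2])  # ~77.8 m (disparity 5 px)
```

The inverse relationship is why disparity errors matter far more for distant objects: halving a 5-pixel disparity doubles the estimated depth, while the same absolute error at 40 pixels barely moves it.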

There are other machine-learning systems that also seek to estimate depth from single photographs, but those are trained in different ways, with some needing scarce, high-quality ground-truth depth data. The new technology needs no real-life depth datasets, and it outperforms all the other systems. Once trained, it runs in the field by processing one ordinary single-lens photo after another.
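One common way such systems avoid ground-truth depth (an assumption about the training setup, not a detail stated in the article) is self-supervision from stereo pairs: the network predicts a disparity map, that map is used to warp one image of the pair into the other’s viewpoint, and the photometric difference between the warped and real images serves as the training loss. A toy sketch of that reconstruction loss:

```python
import numpy as np

def reconstruct_left(right, disparity):
    """Warp the right image toward the left view by sampling each
    pixel from `disparity` pixels to its left (nearest-neighbour
    sampling, clipped at the image border, for simplicity)."""
    h, w = right.shape
    cols = np.arange(w)
    recon = np.zeros_like(right)
    for y in range(h):
        src = np.clip(cols - disparity[y].astype(int), 0, w - 1)
        recon[y] = right[y, src]
    return recon

# Toy one-row "images": a bright object shifted 2 px between views.
left  = np.array([[0., 0., 9., 9., 0., 0.]])
right = np.array([[9., 9., 0., 0., 0., 0.]])

good = np.full_like(left, 2.0)  # correct disparity
bad  = np.full_like(left, 0.0)  # wrong disparity

# Photometric (L1) loss: lower when the predicted disparity is right,
# so minimising it teaches the network depth without depth labels.
loss_good = np.abs(left - reconstruct_left(right, good)).mean()
loss_bad  = np.abs(left - reconstruct_left(right, bad)).mean()
print(loss_good < loss_bad)  # True
```

Because the supervisory signal is just the images themselves, training data can be collected by driving a stereo camera around, with no laser scanner in the loop; at test time only one of the two lenses is needed.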

Understanding the shape of a scene from a single image is a fundamental problem. A 360-degree depth map would be fantastically useful: it could drive wearable tech to assist disabled people with navigation, or map real-life locations for virtual reality gaming, for example.

At the moment, the software requires a desktop computer to process individual images, but the researchers plan to miniaturise it so it can run on hand-held devices such as phones and tablets, making it more accessible to app developers. It has also been optimised only for outdoor use, so the next step is to train it on indoor environments.

The team has patented the technology for commercial use through UCL Business, but has made the code free for academic use. Funding for the research was kindly provided by the Engineering and Physical Sciences Research Council.

via YouTube

Efficient and Precise Interactive Hand Tracking

See an example of 3D hand-tracking research from Microsoft, presented in the proceedings of SIGGRAPH 2016. Researchers created virtual controls thin enough that you can touch your fingers together, giving the experience of touching something hard. They also developed sensory experiences that let people push against something soft and pliant rather than hard and unforgiving, which appears to feel more authentic. Read more at: http://bit.ly/29bdUNM

via YouTube

How a fleet of wind-powered drones is changing our understanding of the ocean | Sebastien de Halleux

Our oceans are unexplored and undersampled — today, we still know more about other planets than our own. How can we get to a better understanding of this vast, important ecosystem? Explorer Sebastien de Halleux shares how a new fleet of wind- and solar-powered drones is collecting data at sea in unprecedented detail, revealing insights into things like global weather and the health of our fish stocks. Learn more about what a better grasp of the ocean could mean for us back on land.

Check out more TED Talks: http://www.ted.com

The TED Talks channel features the best talks and performances from the TED Conference, where the world’s leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design — plus science, business, global issues, the arts and more.

Follow TED on Twitter: http://www.twitter.com/TEDTalks
Like TED on Facebook: http://bit.ly/AaF5HI

Subscribe to our channel: https://www.youtube.com/TED

via YouTube

The incredible potential of flexible, soft robots | Giada Gerboni

Robots are designed for speed and precision — but their rigidity has often limited how they’re used. In this illuminating talk, biomedical engineer Giada Gerboni shares the latest developments in “soft robotics,” an emerging field that aims to create nimble machines that imitate nature, like a robotic octopus. Learn more about how these flexible structures could play a critical role in surgery, medicine and our daily lives.

via YouTube

The Curious World of Synaesthesia | Jamie Ward | TEDxCambridgeUniversity

In this talk, Jamie delves into the little-known world of synaesthesia, a biological condition that causes people’s senses to ‘join up’ when stimulated. By looking at the experiences of synaesthetes, we can understand how their different sensory systems are entangled, consider what these entanglements mean for the creation of artistic masterpieces, and ponder the broader nature of the human sensory experience.

Jamie Ward is currently professor of Cognitive Neuroscience at the University of Sussex.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx

via YouTube