Coordinate frames and shape perception in neural nets (V-VSS 2022 Keynote Lecture)

This keynote lecture was given by Dr. Geoffrey Hinton, professor emeritus at the University of Toronto, engineering fellow at Google Research, and chief scientific adviser at (and co-founder of) the Vector Institute for Artificial Intelligence in Toronto. It was presented on June 1st, 2022, as part of the annual meeting of the Vision Sciences Society.
  • Conference talk
  • Sponsored talk

Date published: February 1, 2023

  • Overview
  • Dr. Geoffrey Hinton
  • Coordinate frames and shape perception in neural nets

Overview

VPixx is proud to sponsor the keynote lecture for the 2022 virtual meeting of the Vision Sciences Society. This talk was originally presented over Zoom on June 1st, 2022.

Dr. Geoffrey Hinton

University Professor Emeritus at the University of Toronto; Engineering Fellow at Google Research; and Chief Scientific Adviser at (and co-founder of) the Vector Institute for Artificial Intelligence in Toronto

Geoffrey Hinton received his PhD in Artificial Intelligence from the University of Edinburgh in 1978. After five years as a faculty member at Carnegie Mellon, he became a fellow of the Canadian Institute for Advanced Research and moved to the Department of Computer Science at the University of Toronto, where he is now an emeritus professor. He is also an Engineering Fellow at Google and Chief Scientific Adviser at the Vector Institute.

Geoffrey Hinton was one of the researchers who introduced the backpropagation algorithm and the first to use backpropagation for learning word embeddings. His other contributions to neural network research include Boltzmann machines, distributed representations, time-delay neural nets, mixtures of experts, variational learning and deep learning. His research group in Toronto made major breakthroughs in deep learning that revolutionized speech recognition and object classification.

Geoffrey Hinton is a fellow of the UK Royal Society and a foreign member of the US National Academy of Engineering and the American Academy of Arts and Sciences. His awards include the David E. Rumelhart Prize, the IJCAI Award for Research Excellence, the Killam Prize for Engineering, the IEEE Frank Rosenblatt Medal, the NSERC Herzberg Gold Medal, the IEEE James Clerk Maxwell Gold Medal, the NEC C&C Prize, the BBVA Foundation Frontiers of Knowledge Award, the Honda Prize, and the Turing Award.

To learn more about Professor Geoffrey Hinton and his research, please visit his website.

Coordinate frames and shape perception in neural nets

Cite this lecture

Hinton, G. (2022, June 1). Coordinate frames and shape perception in neural nets (V-VSS 2022 Keynote Lecture) [Video file]. Retrieved from https://vpixx.com/vocal/coordinate-frames-and-shape-perception-in-neural-nets-v-vss-2022-keynote-lecture
