
Job content

Minimum qualifications:

  • PhD degree in Computer Science, related technical field, or equivalent practical experience.
  • Experience working with Python or C++ and Machine Learning libraries.
  • Publication track record at conferences (e.g., CVPR, SIGGRAPH, NeurIPS, ECCV, ICCV, ICML).


Preferred qualifications:

  • Experience in pose models or related topics.
  • Experience with deploying research into production.
  • Track record of driving impactful research agendas.

About the job

As an organization, Google maintains a portfolio of research projects driven by fundamental research, new product innovation, product contribution, and infrastructure goals, while providing individuals and teams the freedom to emphasize specific types of work. As a Research Scientist, you’ll set up large-scale tests and deploy promising ideas quickly and broadly, managing deadlines and deliverables while applying the latest theories to develop new and improved products, processes, or technologies. From creating experiments and prototyping implementations to designing new architectures, our research scientists work on real-world problems that span the breadth of computer science, such as machine (and deep) learning, data mining, natural language processing, hardware and software performance analysis, improving compilers for mobile platforms, as well as core search and much more.

As a Research Scientist, you’ll also actively contribute to the wider research community by sharing and publishing your findings, with ideas inspired by internal projects as well as from collaborations with research programs at partner universities and technical institutes all over the world.

Our research group focuses on developing next-generation technologies to facilitate realistic data generation, including accurate reconstruction and tracking, physically based animation and simulation, data-driven generative modeling, and differentiable and neural image synthesis, to name just a few research topics. Our team is composed of experts from various disciplines, including Computer Vision, Computer Graphics, and Machine Learning.

We are deeply engaged with academia and product teams alike, yielding cutting-edge research publications as well as stunning product features for users around the globe.

Google’s mission is to organize the world’s information and make it universally accessible and useful. Our Devices & Services team combines the best of Google AI, Software, and Hardware to create radically helpful experiences for users. We research, design, and develop new technologies and hardware to make our users’ interaction with computing faster, more seamless, and more powerful. Whether finding new ways to capture and sense the world around us, advancing form factors, or improving interaction methods, the Devices & Services team is making people’s lives better through technology.

Responsibilities

  • Conduct cutting-edge research and publish at venues to advance visual data generation, including generative modeling, neural representations, neural image synthesis, and domain adaptation and data augmentation.
  • Develop next-generation technologies to capture, model, animate, simulate, and synthesize data with a focus on pose models.
  • Collaborate cross-functionally with other researchers, engineers, and technical artists, as well as product teams.
  • Advise and guide junior researchers and PhD students.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google’s EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Deadline: 04-05-2024


Apply
