Meta is seeking a Research Scientist to join our Llama Applied Multimodal team. We are looking for recognized experts in generative AI, multimodal reasoning, and NLP, with experience in areas such as multimodal model training; data processing for pretraining and fine-tuning; LLM alignment; reinforcement learning for model tuning; efficient training and inference; and image and video generation. The ideal candidate will have an interest in producing and applying new science to help us develop and deploy large multimodal models.
Fundamental Multimodal Research Scientist - Generative AI Responsibilities:
- Lead, collaborate on, and execute research that advances the state of the art in multimodal reasoning and generation.
- Work towards long-term, ambitious research goals while identifying intermediate milestones. Directly contribute to experiments, including designing experimental details, writing reusable code, running evaluations, and organizing results.
- Mentor other team members. Play a significant role in healthy cross-functional collaboration.
- Prioritize research that can be applied to Meta's product development.
Minimum Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience.
- PhD in computer vision, machine learning, NLP, computer science, applied statistics, or other related fields.
- Experience writing software and executing complex experiments involving large AI models and datasets.
- Must obtain work authorization in the country of employment at the time of hire, and maintain ongoing work authorization during employment.
Preferred Qualifications:
- Direct experience in generative AI, computer vision, and multimodal research.
- First-author publication experience at peer-reviewed AI conferences (e.g., CVPR, ICCV, ECCV, NeurIPS, ICML, ICLR, EMNLP, and ACL).
About Meta:
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology. People who choose to build their careers by building with us at Meta help shape a future that will take us beyond what digital connection makes possible today—beyond the constraints of screens, the limits of distance, and even the rules of physics.
Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.
Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.
$177,000/year to $251,000/year + bonus + equity + benefits
Individual compensation is determined by skills, qualifications, experience, and location. Compensation details listed in this posting reflect the base hourly rate, monthly rate, or annual salary only, and do not include bonus, equity or sales incentives, if applicable. In addition to base compensation, Meta offers benefits. Learn more about benefits at Meta.