
Description
WHAT YOU DO AT AMD CHANGES EVERYTHING
At AMD, our mission is to build great products that accelerate next-generation computing experiences – from AI and data centers, to PCs, gaming and embedded systems. Grounded in a culture of innovation and collaboration, we believe real progress comes from bold ideas, human ingenuity and a shared passion to create something extraordinary. When you join AMD, you'll discover the real differentiator is our culture. We push the limits of innovation to solve the world's most important challenges—striving for execution excellence, while being direct, humble, collaborative, and inclusive of diverse perspectives. Join us as we shape the future of AI and beyond.
Together, we advance your career.
THE ROLE
- Lead the team's efforts for our open-source, multilingual language model pretraining, supporting the expansion of open-source AI for long-tail and lower-resource languages.
- Plan and execute efficient language model pretraining while managing architecture risk; experience with and interest in efficient training techniques such as MoE and fine-grained MoE is specifically needed.
- Provide technical leadership and drive the team's approach to model architecture: stay up to date on relevant research, experimentally evaluate new techniques, and drive or implement feature work to support training the desired architecture.
- This team is crucial in establishing Silo AI as a thought leader and leading contributor in developing and sharing multilingual open-source language models.
Collaboration with others
- This role will provide technical leadership for our model pretraining efforts, coordinating and directing work related to model architecture.
- Provide input to and collaborate with data teams on pretraining data mixes, synthetic data, mid-training data, etc.
- Provide feedback to, and coordinate efforts with, the OpenEuroLLM model architecture work package.
Main goals for first 6 months
- Establish a research and development roadmap involving experimental work applied to the current model-building activities in the Base Models team, with an emphasis on efficient training techniques and advanced model architectures.
- Coordinate efforts with OpenEuroLLM architecture research workstreams.
- Publish position papers and white papers, and release the first new experimental models.
THE CANDIDATE
- Experience training large language models, especially MoE models.
- Comfort working as a "full stack" researcher, with familiarity across the full model training stack.
- A record of engineering successes and of research or research-adjacent activities, such as publications, patents, open-source releases, or academic or industrial collaborations.
- A clear and documented commitment to open-source excellence and software craftsmanship.
- A willingness to take technical risks and meet technical challenges head-on.
- Experience in industrial research labs and with software engineering processes.
- Demonstrated communication and leadership skills.
We would like to see
- Mathematical and statistical competence beyond what is expected of computer science engineers.
- Understanding of linguistic behavior and linguistic data.
- Working knowledge of more than one language.
- Research experience, e.g., a postgraduate degree in a relevant field.
LOCATION: Remote from Finland, the UK, Germany, Denmark, or the Netherlands
#LI-DB1
#LI-REMOTE
Benefits offered are described: AMD benefits at a glance.
AMD does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services. AMD and its subsidiaries are equal opportunity, inclusive employers and will consider all applicants without regard to age, ancestry, color, marital status, medical condition, mental or physical disability, national origin, race, religion, political and/or third-party affiliation, sex, pregnancy, sexual orientation, gender identity, military or veteran status, or any other characteristic protected by law. We encourage applications from all qualified candidates and will accommodate applicants' needs under the respective laws throughout all stages of the recruitment and selection process.