(#6331707003) Intern, Machine Learning Engineer - PEFT
What You’ll Learn
- Project Overview: In this internship, the intern will develop state-of-the-art (SOTA) Parameter-Efficient Fine-Tuning (PEFT) methods and demonstrate that they work for both time series and computer vision foundation models
- Skills You’ll Learn:
- Algorithm development for PEFT methods
- Testing PEFT methods on time series foundation models
- Testing PEFT methods on computer vision foundation models
What You’ll Do
The team (SDAL) develops deep learning foundation models and applies them in Samsung factories to problems such as anomaly detection, classification, and segmentation. More specifically, the focus is on two projects: (1) the Large Time Series Model (LTM) and (2) the Large Vision Model (LVM). The intern will be involved in both projects, developing novel Parameter-Efficient Fine-Tuning (PEFT) methods and demonstrating that they work efficiently on both. This will involve applying existing SOTA PEFT methods such as LoRA and SoRA, and improving on them with novel developments.
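For context, PEFT methods such as LoRA freeze the pretrained weights and train only a small low-rank update per layer. A minimal PyTorch sketch of the idea (ranks, shapes, and names here are illustrative, not project specifics):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen nn.Linear with a trainable low-rank update:
    y = W x + (alpha / r) * B A x. Only A and B are trained."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # freeze the pretrained weights
        self.scale = alpha / r
        # B starts at zero, so the wrapped layer is initially
        # identical to the pretrained layer.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.lora_A.T @ self.lora_B.T)

# Illustrative usage: adapt a single layer of a pretrained model.
layer = LoRALinear(nn.Linear(512, 512), r=8)
out = layer(torch.randn(4, 512))  # shape: (4, 512)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # 8192 trainable parameters vs. 262656 in the base layer
```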
Location: Onsite at our San Jose, CA headquarters 5 days a week
Reports to: Sr. Staff Machine Learning Engineer
- Develop PEFT methods and demonstrate that they work efficiently in foundation models
- Apply PEFT methods to Large Time Series Models (LTM) and Large Vision Models (LVM)
- Demonstrate that these algorithms work efficiently for anomaly detection, classification, and segmentation, and achieve pre-defined project KPIs. These algorithms will later be deployed in Samsung’s factories in Korea to help improve production yield
- Collaborate with 3-4 other engineers at Samsung SSI’s San Jose, CA campus; the work is on-site 5 days per week
- Develop algorithms using Python 3, PyTorch, CUDA, etc.
- Present and document your work at the end of the internship period (12 weeks)
- Complete other responsibilities as assigned
What You Bring
- Pursuing a Master’s or PhD in Computer Science or Electrical Engineering preferred; other closely related areas will also be considered
- Must have at least 1 academic quarter or semester remaining
- Good working knowledge of Python 3 and PyTorch required
- Prior experience training and evaluating large deep learning models such as Transformers required
- Prior experience with LLMs, Vision Transformers, time series models, or other foundation models preferred
- Prior publications in top-tier AI/ML conferences such as NeurIPS/CVPR/ICLR/ICML preferred
- You’re inclusive, adapting your style to the situation and diverse global norms of our people
- An avid learner, you approach challenges with curiosity and resilience, seeking data to help build understanding
- You’re collaborative, building relationships, humbly offering support, and openly welcoming different approaches
- Innovative and creative, you proactively explore new ideas and adapt quickly to change