We help the world run better
At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.
About the SAP Internship Experience Program:
The SAP Internship Experience Program is SAP's global, paid internship program that provides university students with opportunities to find purpose in their careers.
Three reasons to intern at SAP
Culture of collaboration: meet with mentors, make new friends across the globe and create a thriving personal network.
Project-driven experience: gain cross-functional skills from our virtual and in-person learning sessions, diverse subject matter experts, and project deliverables.
Gain visibility: with SAP Internship Experience Program in your title, you’ll have a global network of SAP leaders, entrepreneurs, and career development opportunities at your fingertips.
About BDC&I
BDC&I is a new SAP service that delivers Insights, Data Products, and data modeling, engineering, and governance tools in a single data platform spanning all SAP applications. The service provides simplified data sharing across application boundaries and data consumption for analytics and AI. Data products are enriched with semantic metadata and are easily discoverable through the SAP Data Product Catalog. BDC is accessible via a common set of APIs that can be accessed from applications including Datasphere (modeling) and SAC (visualization), as well as advanced “Insight” applications.
About the Data Foundation Services Department
The Data Foundation Services team is the cornerstone of the BDC application, building and operating its fundamental services and applications. Our scope includes customer tenant mapping, data product lifecycle management, data ingestion, data transformation, data sharing, and data privacy and protection. We are also beginning our AI journey and are looking for you to join us and experience it together.
Duties and Responsibilities:
Understand business requirements and designs, then implement code to support cloud customers
Work closely with local and global engineering teams to develop and maintain Data Foundation services, frameworks, and applications
Build scalable, stable, and efficient foundational platform services, frameworks, and applications
Write quality, testable, maintainable, and well-documented code
Write automated tests, including unit, integration, and API tests
Support our global deployment, operations, and customer support
Required Skills & Experience:
BS/MS in Computer Science, Computer Engineering, or a related technical field
Available to work 3+ days a week for at least 6 months or longer
Good knowledge of Java and Python, and of OOP concepts
Good knowledge of cloud-native technology stacks such as Docker, Kubernetes, and Argo CD
Interest in data lake topics such as Spark, metadata management, and data catalogs
A solid foundation in relational databases and SQL
An understanding of system architecture
A team player with good communication skills and fluent spoken and written English
Technical curiosity and a passion for exploring new technologies, including AI, for the future technical roadmap through continuous learning and self-development