- Location: St. Louis, MO
- Remote: Remote
- Type: Contract
- Job #9710
Technology Partners is currently seeking a talented Data Engineer (147925). Do you have experience with Python and Java? Let us help you make your next big career move a reality!
What You Will Be Doing:
The Data Engineer will be involved in the design of big data solutions that leverage open-source and cloud-based technologies within the Location360 enterprise initiative and will work with multiple teams across the organization (e.g., cloud analytics, data architects, business groups). The Data Engineer will participate in building large-scale data processing systems and APIs and should be comfortable working with the latest open-source technologies.
Key responsibilities:
- Design, build, and support cloud and open-source systems that process data assets via an API-based platform.
- Partner with other internal development communities to bring needed data sets into the asset and make that data available to the Bayer Enterprise and internal development communities.
- Build highly scalable APIs and the associated architecture to support thousands of requests per second.
- Work across multiple internal and external teams to gather requirements and ensure project development stays aligned with those requirements.
- Improve the performance of existing services and identify the scope for enhancements.
- Parse, manage, analyze, and make available large data sets, turning information into insights across multiple platforms.
- Define and promote data warehousing design principles and best practices with regard to architecture and techniques within a fast-paced, complex business environment.
- Work at all stages of the software life cycle: proof of concept, MVP, production, and deprecation.
Required Skills & Experience:
- BSc in Computer Science or relevant job experience.
- Minimum of 5 years' experience with Python and/or Java.
- Knowledge of different programming or scripting languages such as Python, Scala, Go, Java, JavaScript, R, SQL, and Bash.
- Experience developing HTTP APIs (REST and/or GraphQL) that serve data using open-source technologies, preferably in a cloud environment.
- Ability to build and maintain modern cloud architectures, e.g., on AWS or Google Cloud.
- Proven experience with the ETL concepts of data integration, consolidation, enrichment, and aggregation. Design, build, and support stable, scalable data pipelines or ETL processes that cleanse, structure, and integrate big data sets from multiple sources and provision them to integrated systems and Business Intelligence reporting.
- Experience working with PostgreSQL/PostGIS.
- Experience with streaming sensor/IoT data, e.g. Kafka.
- Experience with code versioning and dependency management systems such as GitHub, SVN, and Maven.
- Proven success using Docker to build and deploy within a CI/CD environment, preferably using Kubernetes.
Desired Skills & Experience:
- MSc in Computer Science or related field.
- Knowledge of the open-source geospatial tech stack, such as GeoServer.
- Highly proficient (6 years) in Python.
- Experience working with customers and other developers to deliver full-stack development solutions, e.g., collecting software, data, and timeline requirements in an Agile environment.
- Demonstrated knowledge of manufacturing facilities and logistics planning.
- Experience working with SAP/ERP systems.
- Experience implementing scientific models or simulations.
- Demonstrated knowledge of agriculture and/or agriculture-oriented businesses.
- Experience implementing complex data projects with a focus on collecting, parsing, managing, and delivering large data sets to turn information into insights using multiple platforms.
- Experience developing schema data models in a data warehouse environment.
- Demonstrated experience adapting to new technologies.
- Capable of deciding on needed hardware and software designs and acting on those decisions; able to develop prototypes and proofs of concept for the selected solutions.
- Experience with object-oriented design, coding, and testing patterns, as well as experience engineering (commercial or open-source) software platforms and large-scale data infrastructures.
- Experience creating cloud computing solutions and web applications leveraging public and private APIs.
- Proven experience (5 years) with distributed systems, e.g. Argo, Kubernetes, Spark, distributed databases, grid computing.
- Proficient (7+ years) working in a command-line environment, e.g., Docker, Argo, K8s, AWS CLI, gcloud, psql, SSH.
All offers of employment at Technology Partners are contingent upon clear results of a thorough background check and drug screening that meet corresponding laws and regulations at the city, state and federal level.
We are interested in every qualified candidate who is eligible to work in the United States. However, we are not able to provide sponsorship at this time or accept candidates who would require a corp-to-corp agreement.
If this position sounds like you, WE SHOULD TALK!
We realize our people are our most valuable asset, which is why we offer the following benefits:
- Health, Dental, and Vision insurance
- 401(k) retirement plan
- Long- and Short-Term disability
- Life insurance
- Direct deposit
- Referral program
Your better future is ready, and we want to put the right tools in your hands to get you there. Let’s go!
Keywords: data engineer, data science, python, kafka, docker, Argo, K8s, AWS CLI, GCloud, pSQL, SSH
Looking for more opportunities with Technology Partners? Check out technologypartners.net/jobs!
Technology Partners is an Equal Opportunity Employer. Technology Partners does not discriminate on the basis of race, color, religion, sex, national origin, age, disability or any other characteristic protected by applicable state or federal civil rights laws.