
Data Engineer

 

COTH Technologies, Inc. is a growing company based in Frederick, MD, that provides innovative business and IT consulting to help clients meet their diverse technology needs. Using new innovations and analytical strategies, COTH Technologies implements the IT solutions best suited to each client's business. COTH's objective is to provide services that meet customers' needs and exceed their expectations. Our core values are to deliver excellent IT solution services to our clients with fairness, courtesy, and honesty, and to provide a respectful, fun, challenging workplace where people keep learning. We actively recruit dedicated, technically competent individuals and partners who are self-motivated and perform exceptionally well in a collaborative team setting.

COTH Technologies, Inc., a growing software development and research team, is seeking an experienced Data Engineer to support our cloud team.

Job Description:

The ideal candidate has an understanding of software engineering processes and system development life cycles; strong problem-solving skills and the ability to understand and set direction for complex technology integrations; good interpersonal, written, and oral communication skills; and strong teamwork skills with the ability to lead teams. Experience developing data pipelines using Hive/SQL and Spark is expected, and AWS Certified Developer certification is highly preferred. A good understanding of, and hands-on experience with, Python and PySpark is required.

Develop ETL/data integration pipelines between various source systems and the AWS Data Lake:

  • Use ETL tools such as AWS Database Migration Service, AWS Glue, and Spark (Python/Scala) to design, develop, test, and implement data pipelines delivered as part of the application functionality

  • Design new data flows, mappings, and plans, and implement them on AWS

  • Manage and maintain existing and new business rules using ETL tools or a rules engine, and test and integrate them into the data pipelines

  • Work with source-side teams to understand data models and source data requirements, create staging and final Data Lake data models, and produce HLDs and LLDs for the required data models

  • Use SQL skills to query data and understand data relationships, and use ad-hoc querying to understand data flows, transformations, reconciliation, and validation

  • Test the data pipelines in development and QA environments

  • Consult and work with multiple teams on a daily basis to uncover business needs and data integration and validation requirements

  • Exposure to Amazon SageMaker is preferable; knowledge of ML model development using Python scripts and libraries is a must

 

Basic Qualifications:

  • Bachelor’s degree in Computer Science, Computer Engineering, or a related field, or equivalent experience

  • 2+ years of experience as a Data Engineer, including building ETL/data pipelines as well as data visualization and BI solutions

  • 2+ years of experience working in a fast-paced environment; continuous deployment, test-driven development, agile methodologies

  • 2+ years’ experience using SQL and advanced SQL techniques

  • 2+ years’ experience building robust, highly available, and scalable services

  • 2+ years’ experience building and deploying services in the cloud

  • Demonstrated track record of scaling data warehouse solutions, meeting SLAs, and expertise in performance analysis

  • Demonstrated strategic thinking and ability to anticipate the downstream costs of critical engineering decisions

 

Preferred Qualifications:

  • 2+ years’ experience building consumer-facing products

  • 2+ years of experience with SQL and databases such as Snowflake, Redshift, and Postgres

  • 2+ years’ experience working with ETL tools such as Matillion, Informatica, Talend, Alooma, or comparable experience with Python, Java, Scala, Hadoop or Spark

  • 2+ years of experience with Business Intelligence and Visualization tools such as Chartio, Tableau, etc.

  • 2+ years of experience working with automated build and continuous integration systems

  • 2+ years’ experience with database architecture, design, and modeling, and working with a variety of data warehousing systems
