[SFS] AWS Expert/Lead in Englewood, CO

Alexander Arce axa6849 at gmail.com
Mon Apr 15 14:55:05 MDT 2019


Hi all,

Below is a forwarded message from Michelle Sedita:
Email: michelle.sedita at mondo.com
Direct line: 720-263-2909


THE COMPANY is in the process of moving from a traditional data warehouse
environment to a cloud-based infrastructure. The ideal candidate provides
thought leadership to drive innovative, scalable and high-performing
solutions. This role requires an individual who works well in a highly
collaborative environment with fluid requirements, and enjoys building data
pipelines to transform data into usable formats. This includes leveraging
both a data lake and data warehouse. This position is responsible for
architecting, modeling, developing, implementing, maintaining, and
troubleshooting the cloud data warehouse and data infrastructure strategy.
This position serves as a liaison and advisor to team members and business
partners during new project initiatives, helping to mitigate risk and acting
as the Subject Matter Expert for process and data changes.


Essential Duties and Responsibilities:
Design, develop, and implement the cloud warehouse pipeline for optimal
extraction, transformation, and loading of data from a wide variety of data
sources using a broad array of AWS technologies.
Assemble large, complex data sets that meet functional business
requirements and maximize the strategic value of the data.
Collaborate with business users to understand their analytic objectives and
business needs, and translate the objectives into technical specifications
and data warehouse solutions.
Analyze data from multiple data sources and develop processes to integrate
the data into a single, consistent view.
Identify, recommend, and implement ways to improve and ensure data
reliability, efficiency, and quality.
Identify, design, and implement internal process improvements: automating
manual processes, optimizing data delivery, re-designing infrastructure for
greater scalability, etc.
Lead and perform technical work activities including logical and physical
data modeling, data flows, code development, stored procedures, performance
monitoring and tuning, problem support, and technical troubleshooting.
Coordinate the efforts of multiple technical resources to ensure the
solution is implemented on time and accurately.
Responsible for fully documenting the data warehouse environment and
operational components.
Take on a variety of roles when necessary.
QUALIFICATIONS:
Bachelor's degree in computer science or a related field is preferred.
Applicable AWS certification such as Big Data Specialty, Architect, or
Developer preferred.
10 or more years of experience in Business Intelligence, Data Warehousing,
Architecture, and/or Development.
Demonstrated understanding of concepts, best practices and functions of a
data warehouse in the corporate environment, including data discovery, data
cleansing, dimensional modeling/data warehousing and relational databases.
Expert level in Python and PowerShell.
Experienced in .NET.
Strong analysis skills required.
Strong SQL skills required.
Expertise with Snowflake a big plus. Working knowledge of Redshift and
Google BigQuery.
Expert level SQL and JSON handling. Experience working with binary columnar
data formats, e.g. Parquet.
Spark experience a plus, ideally with EMR or Glue. Python with PySpark a
plus. Experience automating third-party CLI tools, shell scripting, and
integrating with APIs for data acquisition.
Experience with version control tools such as TFS, Git, or AWS CodeCommit
required.
Strong communication and team-building skills.
