Data Engineer - Los Angeles, CA at Geebo

Data Engineer

COMPANY OVERVIEW
Genesis Capital is one of the largest private money lenders in the country, focused on providing commercial real estate financing solutions to real estate developers who buy, renovate, and sell single-family residential real estate and/or multi-family apartment buildings.
Genesis was acquired by New Residential Investment Corp., a public REIT managed by an affiliate of Fortress Investment Group LLC, a leading, highly diversified global investment manager.
SUMMARY
The Data team leads efforts to leverage the data we hold to maximize value for our clients and to help the rest of Genesis Capital make the most informed decisions possible.
As part of the team, you will be responsible for contributing to an ecosystem that handles data ingress and egress, interfacing with a variety of sources and destinations.
PRIMARY RESPONSIBILITIES
Build and operationalize complex data solutions, data pipelines, and processes; apply transformations; and recommend data cleansing and quality solutions.
Collaborate with application support and business teams to build tools and data marts that enable analytics.
Proactively monitor data pipelines for completeness and accuracy, communicating with impacted stakeholders to resolve issues when they are identified.
Support the identification, design, and implementation of internal process improvements, such as automating manual processes with RPA tools like Power Automate.
Work independently with general direction and flexibility in a fast-paced environment.
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily.
The requirements listed below are representative of the knowledge, skill, and/or ability required.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
The right candidate should have 5+ years of relevant experience and possess the following qualifications and experience:
BS/MS in Information Technology or related field.
Orchestrating and scheduling data jobs using Astronomer Airflow.
Hands-on development experience in Python.
Building data models and data marts within a data warehouse using dbt.
Experience with Docker and the broader Docker ecosystem.
Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases such as MS SQL Server and PostgreSQL.
Knowledge of AWS cloud environments and services such as S3, Kinesis, EMR, Glue, Lambda, Data Pipeline, RDS, and Redshift.
Experience using Apache Spark in data pipelines.
Experience working in an agile delivery environment with working knowledge of continuous integration/continuous delivery (CI/CD) and DevOps practices.
Experience with the Atlassian suite (JIRA, Confluence, Bitbucket) and version control systems (e.g., Git).
ADDITIONAL SKILLS REQUIRED
Strong business acumen.
Projects an encouraging and positive attitude and works well within a supportive team environment.
Takes initiative with a clear business purpose in mind.
Adapts easily to change in a fast-paced work environment.
Follows direction easily.
Problem solver.
Strong time management skills.
Team player.
Compensation Range:
$110,000 - $120,000
