CAREERS

ETL Developer

No of Positions: 2

Experience : 5+ years of data warehouse experience

Industry : IT Software/Software Services

Functional Area : IT Software-DBA, Data Warehousing

Role : ETL Developer


Technical Skills Required :

  • 5+ years of data warehouse experience
  • 5+ years of ETL (Extract, Transform, Load) Programming experience
  • 3+ years of application production support experience
  • 2+ years of experience leading the requirement design process with both technical and business partners
  • 3+ years of experience gathering business requirements from various sources, including end-users and stakeholders
  • 3+ years of experience with databases such as Oracle, DB2, SQL server, or Teradata
  • 2+ years of experience with Hadoop-based data sets in Parquet, Hive, or HBase

Roles and Responsibilities :

  • The ETL Data Integration Specialist will work closely with the Data Architect, Data Analyst, and Power BI development teams to produce ETL solutions in support of data warehousing and data management that follow industry best practices and company standards.
  • The ETL Specialist will have strong programming and analytical skills, with the ability to review, analyze, and understand business requirements as well as internal database source and target-state architecture. The role covers all systems development life cycle (SDLC) phases, including requirements analysis, system design, development, coding, testing, debugging, implementation, and post-implementation support.
  • Experience with SAP BODS is preferred
  • Strong grasp of advanced SQL writing and query tuning/optimization
  • Experience with data modelling tools such as Erwin is preferred
  • Experience integrating with cloud-based data warehouses on AWS or Azure
  • Develop complex data transformations using ELT and ETL approaches for BI data warehouse models
  • Experience with a Hadoop data lake (Cloudera or Hortonworks distribution) as a source for data cleansing and transformation into the data warehouse environment

Key Skills : ETL, SAP BODS, Teradata, Informatica, SQL

Desired Candidates Profile :

Education: UG: B.Tech/B.E. - Any Specialization; PG: Any Postgraduate - Any Specialization, Post Graduation Not Required

Other Key Skills Required:

None

Reporting Developer

No of Positions: 1

Experience : 3+ years experience in Information Technology (or equivalent role)

Industry : IT Software/Software Services

Functional Area : None

Role : Reporting Developer


Technical Skills Required :

  • 3+ years experience in Information Technology (or equivalent role)
  • Hands-on experience creating self-service tools/dashboards used to perform what-if analysis, predict future outcomes, and highlight drivers of business results.
  • Proficient to expert in writing SQL queries to perform data analysis and/or produce reports for business usage.
  • Tableau or an equivalent dashboarding tool
  • Advanced data analysis in Excel

Roles and Responsibilities :

None

Key Skills : Tableau, Excel, SQL

Desired Candidates Profile :

Education: Bachelor’s degree in Computer Science/Engineering or MIS preferred

Other Key Skills Required:

None

Data Scientist/Modeller

No of Positions: 1

Experience : 3 or more years in data analysis

Industry : IT Software/Software Services

Functional Area : None

Role : Data Scientist/Modeller


Technical Skills Required :

  • Three (3) or more years of experience in data analysis.
  • SQL or comparable data manipulation expertise required. Traditional relational warehouse experience required.
  • Proficiency in unstructured data mining and data manipulation required. Familiarity with MapReduce programming and ‘big data’ query scripts such as Hive or Pig required.
  • Proficiency with data modeling and statistical techniques and software required (D3.js, R, etc.).
  • Understanding of big data ecosystems and ability to lead ecosystem development.
  • Experience in performing data quality analysis and profiling tasks, including source/target assessments, data model integrity, redundancy checks, ranges and outliers, and sparsity/density metrics.
  • Advanced data visualization proficiency required (MicroStrategy, Tableau, etc.).
  • Strong analytical, organizational, communication and presentation skills required.
  • Strong organization and time management skills; prior experience leading a small team is preferred
  • Ability to work effectively within a team environment
  • Ability to work effectively in a high-energy, fast-paced, and dynamic environment

Roles and Responsibilities :

  • Utilizes R, Python, or equivalent technologies to perform complex statistical analyses on both internal and external data.
  • Analyzes customer data to identify trends, opportunities, risks, and provide recommendations.
  • Utilizes structured and unstructured data stored on platforms such as HP Vertica, Hadoop (Cloudera), SQL Server, and Oracle Exadata to complete analyses.
  • Obtains, scrubs, explores, models, and interprets data currently stored in Oracle databases using SQL and other data mining tools such as SAS or SPSS Modeler.
  • Designs data models that enable real-time event-based scoring and customer segmentation on our front-end applications.
  • Leverages advanced data mining techniques to extract insights and shares those insights and corresponding recommendations to the appropriate parties.
  • Assists in requirements creation for end-to-end data flow, from collection in front-end applications all the way to analytical consumption.
  • Evaluates the effectiveness of customer acquisition and retention activities.
  • Oversees the activities of predictive modelers, and provides expert guidance. Provides business insight to modelers.
  • Utilizes administrative access to both Windows and Linux server environments for analytical development.
  • Designs data structures to support analytical projects.

Key Skills : None

Desired Candidates Profile :

Education: Bachelor’s degree required. Concentration in Management Information Systems, Computer Science, Engineering, Mathematics, Statistics, Predictive Analytics, or Marketing Research preferred.

Advanced: Master’s degree in Computer Science, Applied Mathematics, or Computational Statistics preferred

Other Key Skills Required:

None

Big Data Architect

No of Positions: 1

Experience : 8+ years in big data and related technologies

Industry : IT Software/Software Services

Functional Area : IT Software-DBA, Data Warehousing

Role : Solution Architect


Technical Skills Required :

  • 8+ years of experience in big data and related technologies
  • Experience installing production-grade Hadoop clusters, including configuration of Hadoop services (HBase, Impala, Hive, Hue, HDFS)
  • 3+ years of experience with production Hadoop platform administration (any of Cloudera, Hortonworks, MapR, or Apache Foundation Hadoop), including monitoring tools such as Nagios, Ganglia, Cloudera Manager, and Ambari
  • Experience architecting Hadoop security methodologies, performance and capacity management, backup and disaster recovery, and high-availability solutions
  • Demonstrated ability to optimize clusters and recommend best practices for a Hadoop cluster
  • Experience in debugging and analyzing Hadoop platform and services issues
  • Knowledge of data ingestion and processing frameworks, along with data security and governance
  • Experience with Hadoop deployments in virtualized environments
  • Strong coding skills in scripting languages such as Shell, Perl, and Python
  • Experience performing Linux-level package installation and internal/external repository management
  • Ability to lead and mentor a team of highly skilled Hadoop administrators
  • Ability to design solutions to complex HDFS administration problems and guide the team toward the right solutions
  • Ability to work with distributed teams in a collaborative and productive manner
  • Utilize industry research to improve Triquetro’s Big Data environment.

Roles and Responsibilities :

  • Design, architect, and maintain the HDFS platform ecosystem for stable and efficient application-layer integration
  • Work with engineering, data science/analytics, and application teams to scope, size, install, and configure components of HDFS data platforms
  • Maintain industry expertise by reviewing and conducting POCs on new technologies and by participating in continuing education and training
  • Evaluate the right technologies based on business needs and work with the team to adopt them
  • Provide technical mentorship to associates and enable the team to contribute to the open source community
  • Ensure that business needs are being met by evaluating the ongoing effectiveness of current plans, programs, and initiatives; consult with business partners, managers, coworkers, and other key stakeholders; solicit, evaluate, and apply suggestions for improving efficiency and cost effectiveness
  • Engage with technical experts to set priorities for the team and deliver the best services to customers around the globe
  • Promote and support company policies, procedures, mission, values, and standards of ethics and integrity.

Key Skills : Hadoop, Hive, Java, Spark, NoSQL, Big Data, Cloudera, Hortonworks, Nagios, Ganglia, Ambari, Shell, Perl

Desired Candidates Profile :

Education: UG: B.Tech/B.E. - Any Specialization; PG: Any Postgraduate - Any Specialization, Post Graduation Not Required; Doctorate: Doctorate Not Required

Other Key Skills Required:

  • Complete ownership of assigned tasks and timely, quality delivery
  • Curiosity about current industry trends, technologies, tools, and frameworks
  • Appetite to implement bleeding-edge technology and adapt to frequent change
  • Excellent verbal and written communication skills

Big Data Architect

No of Positions: 1

Experience : 5+ years in big data

Industry : IT Software/Software Services

Functional Area : IT Software-DBA, Data Warehousing

Role : Solution Architect


Technical Skills Required :

  • Familiarity with C++11 is a plus
  • Familiarity with machine learning/deep learning algorithms
  • Experience with big data frameworks such as HDFS, Spark, Cassandra, or TensorFlow
  • Experience with query optimization and product performance improvement
  • Experience with large-scale distributed systems design preferred
  • Experience with MySQL preferred
  • Experience with the Linux kernel preferred

Roles and Responsibilities :

As an Architect, you are responsible for providing technical leadership on small-size/complexity/order-value projects. You are expected to have depth of knowledge in the specified technology area, including knowledge of applicable processes, methodologies, standards, products, and frameworks. You will be responsible for defining and documenting architecture, capturing and documenting non-functional (architectural) requirements, preparing estimates, and defining technical solutions for proposals (RFPs). You should provide technical leadership to the project team across design-to-deployment activities, provide guidance, perform reviews, and prevent and resolve technical issues.

Key Skills : HDFS, Spark, Cassandra, Machine Learning, MySQL, Linux Kernel

Desired Candidates Profile :

Education: Any Degree

Other Key Skills Required:

Minimum work experience: 5-8 years

Location: Houston, TX.
Travel: up to 60%

