Software Engineer - Big Data 24/7 Customer
THIS JOB HAS EXPIRED
Are you passionate about building platforms that use machine learning to turn massive amounts of data into experiences that anticipate users' needs so well they say "how did they do that?" In the role of senior software engineer, you will work on building a highly scalable and extensible big data platform that provides the foundation for collecting, storing, modeling, and analyzing massive data sets from multiple channels, including the web, IVR, mobile, social media, and live agent call and chat interactions.
Responsibilities include the design and development of various components of the big data platform, including the Large Data Store, Processing Engine, Job Control System, Workflow Processor, Mapping and Model Training Framework, and Reporting Aggregate Processor. Successful candidates will have a computer science or related degree, expert-level knowledge of Java/OO technologies, and 6+ years' experience engineering commercial software platforms and large-scale data infrastructures. The ideal candidate will possess a high level of technical creativity, problem-solving ability, software process and lifecycle expertise, communication skills, and an ability to work independently and as part of a team.
The position is based in Campbell or Alameda, CA.
Design and develop new components of the big data platform
Own the platform development process from inception to production deployment, including project scheduling, design, implementation and coordination with other team members
Work closely with Product Management and Data Scientist teams to define and refine the big data platform to achieve company product and business objectives
Collaborate with other technology teams and architects to define and develop cross-functional technology stack interactions
Research and experiment with emerging technologies and tools related to big data
Contribute to the big data open source ecosystem
Work with the Engineering management team to establish and reinforce disciplined software development processes and best-practices
Bachelor's or Master's degree in Computer Science, Electrical Engineering, or equivalent
Experience using big data platforms and tools such as Hadoop, Hive, Cassandra, HBase, Pig, Flume, Sqoop, and Oozie
Experience scaling applications on big data platforms to massive size
Mastery in Java/C++ programming
Proficient in SQL/RDBMS
Strong distributed systems design skills
Demonstrated experience transitioning systems to Operations
6+ years of commercial software development experience
5+ years of data warehouse experience with a strong understanding of the technology stack
2+ years of Hadoop experience
Working knowledge of machine learning, rule-based decision making, information parsing, and complex data mining
Development lead experience in a SaaS or packaged software company
Experience creating, managing, and administering Hadoop and NoSQL storage systems
Knowledge of statistical methods and data modeling
910 East Hamilton Avenue
Campbell, CA 95008