Quality Assurance Analyst III RealMed
THIS JOB HAS EXPIRED

Availity delivers revenue cycle and related business solutions for health care professionals who want to build healthy, thriving organizations. Availity has the powerful tools, actionable insights, and expansive network reach that medical businesses need to get an edge in an industry constantly redefined by change.
As a Quality Assurance Analyst, you're responsible for continuously improving the quality of the company's product line by preventing, quickly detecting, and validating product and process defects to deliver market- and customer-driven business results.
You will participate in the core development process as both a consumer and a supplier of artifacts to ensure the delivery of quality software applications.
You will evaluate test results, prepare change requests, and generate measures to assess product quality. These measures include code and requirement coverage, cost/coverage tradeoffs, and defect management, quantifying progress and identifying risks through the associated test execution statistics.
In this role you will work with an integrated team of developers, product managers, product owners, and system analysts for test planning, test driven development, and product validation (white and black box testing, regression validation, and user acceptance story tests).
Provide Production and Project Support
Analyze root causes of quality deficiencies and ensure proper corrective actions are implemented in a timely manner.
Prepare QA Guidance and Standards
Follow the company's software quality assurance guidelines.
Lead the effort to define, implement, measure, optimize, and audit our product development quality control standards and processes.
Assist in the ongoing evolution of the company's software quality assurance best practices.
Stay abreast of the healthcare industry and the QA discipline via self-study and training.
Create/update Test Plan/Strategy (or revise Scope and Schedule to existing Strategy)
Document risks and assist Scrum Master with mitigation approaches
Conduct project team reviews
Utilize ATDD (acceptance test-driven development) techniques to create a robust, reliable, repeatable automation test bed.
Coordinate Performance testing with Test Services
Provide sign-offs for project deliverables as the QA representative (when required)
Create/update Test Environment Plan (if new or changed)
Create/update Test Execution Plan, which combines the strategy and environment plan with:
Test Cases (reported from the test repository)
Test Data Plan
Set and monitor test execution rate times/schedules
Test Planning Status (Progress)
Project Issues (Defects)
Estimate (Sizing) - lead the estimation of the testing effort.
Smoke Test - To validate that all applications and components integrate and communicate without significant failure.
Acceptance Testing - To exercise all customer paths / routes through the application, positive and negative cases, and all possible technical integration and failure points.
Manual Exploratory Testing - Ensures that Acceptance Criteria are met for new features.
Incremental Integration Regression Product Testing - Continuous testing of regressed iteration module(s) as new functionality is added; this occurs each iteration.
Support Production Release Validation Process
Release Regression - To ensure the final release QA candidate (after fixes or modifications of the software) is not adversely impacted when integrated with existing modules. This occurs on a monthly basis.
UAT (User Acceptance Testing) - To validate that the system can solve the original business problem and/or requested features and can operate successfully within the end-user community's business processes.
Test Automation - Use of special software (separate from the software being tested) to control test execution, compare actual outcomes to predicted outcomes, set up test preconditions, and perform other test control and reporting functions.
Test Data Validation - To determine whether a set of test data or test cases is useful by deliberately introducing various code changes ('bugs') and retesting with the original test data/cases to determine if the 'bugs' are detected.
Daily Test Execution report (daily during testing) - Consists of Test Execution Highlights, Scenario Status, and Defect Summary
Project Issues (as discovered during testing)
Defect summary (weekly during testing, available on request)
Risk and Mitigation Plans
Final Testing Report - Statistical and risk information for the go/no-go decision.
WORK EXPERIENCE & SKILLS (Required)
Must have experience working with a test team in an agile development environment.
Must have 6-8 years of QA experience in a high-tech environment.
Minimum 5 years of Web and Batch QA experience.
Highly proficient in testing tiered web-based applications (ASP, .NET, Java, HTML, XML) utilizing a relational database back end (Oracle and SQL).
Experience in testing Web Services (SoapUI) and .NET 2.0 and higher.
WORK EXPERIENCE & SKILLS (Preferred)
Experience with configuration management and deployment activities.
Experience with billing systems
Unit testing support experience
Release Engineering experience
Minimum of 2 years' experience with Selenium, IBM Rational Functional Tester, HP QuickTest, or comparable tools.
CSTE and/or CSQA certifications
Minimum 5 years of healthcare experience.
EDUCATION AND CERTIFICATION (Required)
Bachelor's degree in Engineering, Computer Science, or a related discipline.
NOTE: 2-4 years of relevant work experience may serve as an equivalent for the bachelor's degree.
Location: Indianapolis, IN