JOB DESCRIPTION:
Provide business support through data analysis and research. Modify pre-designed queries to assess data for research purposes. Independently run data statistics, including coverage statistics, field populations, etc. Run data profiles. Support business or other data engineering teams with data subject matter expertise. Complete new data evaluations for diverse audiences. Create data documentation, including mappings and quality thresholds. Investigate, track, and report data issues. Conduct data review and verification, ensuring corrections/clarifications are made in a timely manner. Maintain deep familiarity with a specific data content area. Provide technical problem-solving support to less technical teams (customer service, consumer advocacy, etc.). Undertake and complete new data onboarding and/or production data management functions as assigned/required. Utilize various data workflow management and analysis tools. Resolve basic data issues, or guide data sources in resolving them, within a specific functional area or data type. Produce basic ad hoc data counts, statistics, and examples as required. Perform other duties as needed.

REQUIREMENTS:
Master’s degree (or foreign equivalent) in Computer Science, Data Science, Information Systems, or a related field required. 2 years of experience in the job offered or related occupations required.

Also required is 2 years of experience: with relational database management systems (RDBMS) such as MySQL, PostgreSQL, or SQL Server, including extensive experience writing complex SQL queries, optimizing query performance, and managing large-scale databases; analyzing structured and unstructured datasets to derive actionable business insights; with BI tools such as PowerBI, Tableau, or QlikView, creating interactive dashboards, conducting trend analysis, and presenting data in a visually compelling manner; with data modeling principles, including the ability to design and implement conceptual, logical, and physical data models; normalizing data, ensuring referential integrity, and optimizing models for both transactional and analytical purposes; programming in Python or R, with a focus on automating data workflows, performing statistical analysis, and developing custom scripts for data processing; using libraries such as Pandas, NumPy, and Scikit-learn for data manipulation and machine learning; and working with large datasets, utilizing big data processing tools and technologies to efficiently analyze and manage data at scale.

Also required is 1 year of experience: with data integration tools (e.g., Talend, Informatica, Apache NiFi) and ETL processes, including the extraction, transformation, and loading of data across diverse data sources; designing and implementing robust ETL pipelines to ensure seamless data flow and transformation, adhering to best practices in data warehousing and data management; providing technical problem-solving support to less technical teams, such as customer service or consumer advocacy; and investigating, tracking, and resolving data issues within specific functional areas or data types, ensuring data accuracy and integrity.

Employee reports to the LexisNexis Risk Solutions, Inc. office in Alpharetta, GA, but may telecommute from any location within the U.S. Experience can be concurrent.

#LI-DNI
#ICT
LexisNexis, a division of RELX, is an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form: https://forms.office.com/r/eVgFxjLmAK, or contact 1-855-833-5120.
Please read our Candidate Privacy Policy.