Data Scientist
Data Scientist Job 15 miles from Pelham
**Details**
Kemper is one of the nation's leading specialized insurers. Our success is a direct reflection of the talented and diverse people who make a positive difference in the lives of our customers every day. We believe a high-performing culture, valuable opportunities for personal development and professional challenge, and a healthy work-life balance can be highly motivating and productive. Kemper's products and services are making a real difference to our customers, who have unique and evolving needs. By joining our team, you are helping to provide an experience to our stakeholders that delivers on our promises.
Data Science is a driver of significant competitive advantage for Kemper and is critical to the organization's success. As a member of the Kemper Auto Data Science team, this position is responsible for building and developing analytical solutions, especially in updating existing solutions.
**Responsibilities:**
+ Independently designs and develops analytical solutions.
+ Develops and automates processes that can be deployed throughout the organization to solve recurring analytics needs.
+ Communicates project progress and challenges to the working team.
+ Develops database, technical, and business knowledge.
**Qualifications:**
+ Graduate degree in Mathematics, Statistics, Engineering, or other STEM field with 2-4 years of experience working in a data science/analytics environment.
+ PhD in Mathematics, Statistics, Engineering, or other STEM field with related industry experience preferred.
+ Proficient in Python programming language (scikit-learn, pandas, numpy, scipy, matplotlib, etc.).
+ Proficiency and experience with several modeling techniques such as generalized linear models, decision trees, ensemble learning, regularized models (ridge/lasso/elastic net), clustering, and neural networks (a brief illustrative sketch follows this list).
+ Some experience with natural language processing is preferred.
+ Prior experience working with imbalanced datasets and AWS is a plus.
+ Ability to work with various data formats, especially relational databases, as well as delimited text files, data frames, and JSON.
+ Excellent overall communication skills, particularly possessing the ability to translate technical results for wide audiences.
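As a rough illustration of the regularized modeling techniques listed above, the following sketch fits a ridge regression with scikit-learn. The dataset, feature names, and hyperparameters are hypothetical and not part of the posting.

```python
# Minimal sketch: fitting a regularized (ridge) regression with scikit-learn.
# The data, column names, and hyperparameters are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical auto-policy data: two numeric features and a loss target.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "driver_age": rng.integers(18, 80, size=1000),
    "vehicle_age": rng.integers(0, 20, size=1000),
})
df["loss_amount"] = 200 + 3 * df["vehicle_age"] - 1.5 * df["driver_age"] + rng.normal(0, 25, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    df[["driver_age", "vehicle_age"]], df["loss_amount"], test_size=0.2, random_state=0
)

model = Ridge(alpha=1.0)  # L2-regularized linear model
model.fit(X_train, y_train)
preds = model.predict(X_test)
print("RMSE:", mean_squared_error(y_test, preds) ** 0.5)
```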
The range for this position is $90,300 to $150,400. When determining candidate offers, we consider experience, skills, education, certifications, and geographic location among other factors. This job is eligible for an annual discretionary bonus and Kemper benefits (Medical, Dental, Vision, PTO, 401k, etc.).
Kemper is proud to be an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, disability status or any other status protected by the laws or regulations in the locations where we operate. We are committed to supporting diversity and equality across our organization and we work diligently to maintain a workplace free from discrimination.
Kemper does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Kemper and Kemper will not be obligated to pay a placement fee.
Kemper will never request personal information, such as your social security number or banking information, via text or email. Additionally, Kemper does not use external messaging applications like WireApp or Skype to communicate with candidates. If you receive such a message, delete it.
#LI-WH-1
**Kemper at a Glance**
The Kemper family of companies is one of the nation's leading specialized insurers. With approximately $13 billion in assets, Kemper is improving the world of insurance by providing affordable and easy-to-use personalized solutions to individuals, families and businesses through its Kemper Auto and Kemper Life brands. Kemper serves over 4.8 million policies, is represented by approximately 22,200 agents and brokers, and has approximately 7,500 associates dedicated to meeting the ever-changing needs of its customers. Learn more at Kemper.com.
*Alliance United Insurance Company is not rated.
_We value diversity and strive to be an employer of choice. An Equal Opportunity Employer, M/F/D/V_
**Our employees enjoy great benefits:**
- Qualify for your choice of health and dental plans within your first month.
- Save for your future with robust 401(k) match, Health Spending Accounts and various retirement plans.
- Learn and Grow with our Tuition Assistance Program, paid certifications and continuing education programs.
- Contribute to your community through United Way and volunteer programs.
- Balance your life with generous paid time off and business casual dress.
- Get employee discounts for shopping, dining and travel through Kemper Perks.
AI & GEN AI Data Scientist-Experienced Associate
Data Scientist Job 15 miles from Pelham
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals.
In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.
Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
* Apply a learning mindset and take ownership for your own development.
* Appreciate diverse perspectives, needs, and feelings of others.
* Adopt habits to sustain high performance and develop your potential.
* Actively listen, ask questions to check understanding, and clearly express ideas.
* Seek, reflect, act on, and give feedback.
* Gather information from a range of sources to analyse facts and discern patterns.
* Commit to understanding how the business works and building commercial awareness.
* Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.
Minimum Degree Required: Bachelor's Degree
Minimum Year(s) of Experience: 1 year
Demonstrates thorough-level abilities and/or a proven record of success managing the identification and addressing of client needs:
* Building of GenAI and AI solutions, including but not limited to analytical model development and implementation, prompt engineering, general all-purpose programming (e.g., Python), testing, communication of results, front end and back-end integration, and iterative development with clients
* Documenting and analyzing business processes for AI and Generative AI opportunities, including gathering of requirements, creation of initial hypotheses, and development of GenAI and AI solution approach
* Collaborating with client team to understand their business problem and select the appropriate analytical models and approaches for AI and GenAI use cases
* Designing and solutioning AI/GenAI architectures for clients, specifically for plugin-based solutions (i.e., ChatClient application with plugins) and custom AI/GenAI application builds
* Processing unstructured and structured data to be consumed as context for LLMs, including but not limited to embedding of large text corpus, generative development of SQL queries, building connectors to structured databases
* Supporting management of the daily operations of a global data and analytics team on client engagements, reviewing developed models, providing feedback, and assisting in analysis;
* Directing data engineers and other data scientists to deliver efficient solutions to meet client requirements;
* Leading and contributing to development of proof of concepts, pilots, and production use cases for clients while working in cross-functional teams;
* Structuring, writing, communicating, and facilitating client presentations; and,
* Directing associates through coaching, providing feedback, and guiding work performance.
Demonstrates thorough abilities and/or a proven record of success learning and performing in functional and technical capacities, including the following areas:
* Managing AI/GenAI application development teams including back-end and front-end integrations
* Using Python (e.g., Pandas, NLTK, Scikit-learn, Keras etc.), common LLM development frameworks (e.g., Langchain, Semantic Kernel), Relational storage (SQL), Non-relational storage (NoSQL);
* Experience in analytical techniques such as Machine Learning, Deep Learning and Optimization
* Vectorization and embedding, prompt engineering, and retrieval-augmented generation (RAG) workflow development (a simplified retrieval sketch follows this list)
* Understanding of or hands-on experience with Azure, AWS, and/or Google Cloud platforms
* Experience with Git Version Control, Unit/Integration/End-to-End Testing, CI/CD, release management, etc.
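As a simplified illustration of the retrieval step in a RAG workflow mentioned above, the sketch below ranks a few documents against a query and builds a prompt from the best match. TF-IDF stands in for learned embeddings, and all documents, names, and the query are hypothetical.

```python
# Minimal sketch of the retrieval step in a RAG workflow.
# TF-IDF stands in for learned embeddings; documents and the query are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Quarterly revenue grew 8% driven by the consumer segment.",
    "The data platform migration to the cloud completed in March.",
    "Customer churn declined after the loyalty program launch.",
]
query = "What happened to customer churn?"

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)
query_vector = vectorizer.transform([query])

# Rank documents by similarity and keep the best match as LLM context.
scores = cosine_similarity(query_vector, doc_vectors)[0]
context = documents[scores.argmax()]

# The retrieved context would then be inserted into the prompt sent to an LLM.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```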
Travel Requirements: Up to 80%
Job Posting End Date
Learn more about how we work: **************************
PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: ***********************************
As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or, any other status protected by law.
For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles' Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for Employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all.
Applications will be accepted until the position is filled or the posting is removed, unless otherwise set forth on the following webpage. Please visit this link for information about anticipated application deadlines: ***************************************
The salary range for this position is: $75,000 - $118,000, plus individuals may be eligible for an annual discretionary bonus. For roles that are based in Maryland, this is the listed salary range for this position. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications and location, and applicable employment laws. PwC offers a wide range of benefits, including medical, dental, vision, 401k, holiday pay, vacation, personal and family sick leave, and more. To view our benefits at a glance, please visit the following link: ***********************************
Lead Data Scientist
Data Scientist Job 15 miles from Pelham
Do you have a strong background in machine learning and deep learning? Are you interested in utilizing your data science skills and working with a small team in a fast-paced environment to achieve strategic mission goals? If so, Deloitte has an exciting opportunity for you!
The Team:
The GPS GSi group at Deloitte is dedicated to driving innovation and efficiency through advanced data science and business intelligence solutions. Our team collaborates closely with Enabling Area professionals to develop and implement cutting-edge machine learning, deep learning, and generative AI initiatives. We thrive in a dynamic environment where teamwork and independent problem-solving are key to achieving our strategic mission goals. Join us to be part of a small, agile team that is at the forefront of transforming decision-making processes and delivering impactful solutions.
Recruiting for this role ends on July 31st, 2025.
What You'll Do:
As a member of our GPS GSi group, you will play a crucial role in the development and maintenance of our data science and business intelligence solutions. This role will specialize in assisting with machine learning, deep learning, and generative AI initiatives that will be utilized by Enabling Area professionals to enhance and expedite decision-making. You will provide expertise within and across business teams, demonstrate the ability to work independently, and apply problem-solving skills to resolve complex issues.
* Lead and participate in developing data science products, transforming client needs into quantifiable solutions.
* Independently carry out tasks, using critical thinking and problem-solving skills to devise effective solutions.
* Design, train, and deploy machine learning and deep learning models on platforms like AWS, Databricks, and Dataiku.
* Develop and advise on Large Language Model (LLM) solutions for enterprise-wide documentation, including Retrieval-Augmented Generation (RAG), Continued Pre-training (CPT), and Supervised Fine-tuning (SFT).
* Utilize MLOps pipelines, including containerization (Docker) and CI/CD, for training and deploying models (a minimal serving sketch follows this list).
* Write clean, well-commented code for easy collaboration and maintain structured project documentation using GitHub and/or Jira.
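As a minimal illustration of the model-serving side of an MLOps pipeline referenced above, the sketch below exposes a saved model behind a FastAPI endpoint. The artifact name, feature names, and schema are hypothetical; a real deployment would add containerization and CI/CD around it.

```python
# Minimal sketch of serving a trained model behind an HTTP endpoint with FastAPI.
# The model file name, feature names, and schema are hypothetical.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical artifact produced by a training pipeline

class Features(BaseModel):
    feature_a: float
    feature_b: float

@app.post("/predict")
def predict(features: Features) -> dict:
    # The feature order must match the order used during training.
    pred = model.predict([[features.feature_a, features.feature_b]])
    return {"prediction": float(pred[0])}

# Run locally with: uvicorn app:app --reload
# In an MLOps setup this service would typically be packaged in a Docker image
# and promoted through a CI/CD pipeline.
```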
Qualifications
Required skills:
* Bachelor's Degree in Statistics, Mathematics, Computer Science, Engineering, or another analytical field.
* 6+ years of experience in data science, with deep knowledge of Python, machine learning, deep learning, and related packages (e.g., sklearn, TensorFlow, PyTorch). Proven ability to lead data science projects from inception to deployment.
* Strong knowledge of Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG).
* Familiarity with AWS, Databricks, and/or Dataiku platforms.
* Working knowledge of MLOps, including containerization (e.g., Docker).
* Strong organizational skills, with clear project documentation and the ability to write clean code.
* Familiarity with agile project methodology and project development lifecycle.
* Experience with GitHub for version control.
* Ability to manage multiple detailed tasks and responsibilities simultaneously, meeting deadlines and objectives accurately.
* Must be legally authorized to work in the United States without employer sponsorship, now or in the future.
* Ability to travel 10%-25%, on average, based on the work you do and the clients and industries/sectors you serve.
Preferred Skills:
* Master's Degree in Statistics, Mathematics, Computer Science, Engineering, or another analytical field, or equivalent direct work experience.
* Significant experience with MLOps and associated serving frameworks (e.g., Flask, FastAPI) and orchestration pipelines (e.g., SageMaker Pipelines, Step Functions, Metaflow).
* Significant experience working with open-source LLMs, including serving via TGI/vLLM and performing Continued Pre-training (CPT) and/or Supervised Fine-tuning (SFT).
* Experience using various AWS Services (e.g., Textract, Transcribe, Lambda, etc.).
* Proficiency in basic front-end web development (e.g., Streamlit).
* Knowledge of Object-Oriented Programming (OOP) concepts.
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $97,600 to $179,900.
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation: ************************************************************************************************************
EA_ExpHire, EA_GPS_ExpHire, #LI-JK2
Recruiting tips
From developing a stand out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Learn more.
Professional development
From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.
As used in this posting, "Deloitte" means Deloitte Services LP, a subsidiary of Deloitte LLP. Please see ************************* for a detailed description of the legal structure of Deloitte LLP and its subsidiaries.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Requisition code: 302514
Data Scientist Lead
Data Scientist Job 34 miles from Pelham
Manages and builds systems of models to analyze diverse big data sources to generate insights and solutions for business partners and product enhancement. Manages and participates in developing, testing and validating models that drive business value. Assists with identifying and interpreting insights from data. Direct leadership of assigned data science team.
**POSITION RESPONSIBILITIES:**
Manage and participate in working with large data sets to solve unstructured problems using different analytical and statistical approaches for a single domain.
Manage and participate in sourcing, ingesting, and cleaning of data sets in preparation for analysis, work with data teams to productionize and scale data cleanup process. Ensure data is stable, accounting for complex data drift in development and production.
Manage and participate in the building of econometric, statistical and machine learning models for various problems inclusive of classification, clustering, pattern analysis, sampling, and simulations.
Manage the committing of complex code into the model repository to serve as a source for others, and promote complex models into the production system.
Manage and develop champion/challenger models and adjust models accordingly.
Lead the selection and refinement of models taking into account performance, reliability and stability metrics and business feedback.
Develop model refinement educational materials and deliver related training for data users.
Guide less experienced data scientists on model development, selection, refinement, measurements, and visualizations and creating consumable model outputs.
Create outputs from multiple models for business discussions to display model outcomes, impact and business value.
Lead stakeholder meetings to discuss concerns, opportunities and production challenges.
Work with more experienced data scientists to develop new research approaches, provide recommendations on how techniques will be adapted based on client needs, and attend meetings with data clients to understand their research questions.
Review own and assigned team's code to ensure it is efficient, accurate, and using best practices.
Exercise usual authority of a manager concerning staffing, performance appraisals, promotions, salary recommendations, performance management and terminations.
Understand and adhere to the Company's risk and regulatory standards, policies and controls in accordance with the Company's Risk Appetite. Design, implement, maintain and enhance internal controls to mitigate risk on an ongoing basis. Identify risk-related issues needing escalation to management.
Maintain M&T internal control standards, including timely implementation of internal and external audit points together with any issues raised by external regulators as applicable.
Complete other related duties as assigned.
**Specific to Posting:**
+ Perform complex analysis and judgment based work to support the identification and quantification of compliance risk.
+ Utilize a data driven approach to assess the effectiveness of risk controls, identifying exceptions and investigating root cause.
+ Execute independent assignments within defined timelines with attention to detail and an investigative, quality-first mindset.
**MANAGERIAL/SUPERVISORY RESPONSIBILITY:**
Indirect supervision - staff of 6-8
**MINIMUM QUALIFICATIONS REQUIRED:**
Bachelor's degree and a minimum of 7 years related experience, or in lieu of a degree, a combined minimum of 11 years of higher education and/or work experience, including a minimum of 7 years related experience
Intermediate experience applying statistics and data science principles such as A/B testing, sample selection, hypothesis testing, and modeling bias
Intermediate proficiency with pertinent statistical software, languages, and tools
Experience with various hybrid databases, both on premise and in the cloud
Intermediate-level knowledge of Python
Expert understanding of modeling techniques such as Bayesian modeling, classification models, cluster analysis, neural networks, non-parametric methods, and multivariate statistics
Experience analyzing large data sets
**IDEAL QUALIFICATIONS PREFERRED:**
Master of Science or Doctorate degree in Statistics, Economics, Finance or a related field in the quantitative social, physical or engineering sciences, with proven coursework proficiency in statistics, econometrics, economics, computer science, finance or risk management
Fluent in econometric/statistical techniques, including time-series analysis, predictive modeling, segmentation, NLP, inferential statistics, panel data methods and logistic regression
Hands-on experience with pertinent statistical software, languages, and tools, including 5-7 years of experience with Python
Experience with data visualization and business intelligence, creating insightful dashboards in tools such as Power BI, Tableau or Looker
Knowledge of the banking industry, particularly key banking regulations (e.g., UDAAP, Fair Lending, SCRA, Privacy, FCRA, etc.)
Exposure to a variety of database types including SQL Server, Teradata and Snowflake
M&T Bank is committed to fair, competitive, and market-informed pay for our employees. The pay range for this position is $115,703.73 - $192,839.55 Annual (USD). The successful candidate's particular combination of knowledge, skills, and experience will inform their specific compensation. The range listed above corresponds to our national pay range for this role. The specific pay range applicable to you may vary based on your location.
**Location**
Clanton, Alabama, United States of America
M&T Bank Corporation is an Equal Opportunity/Affirmative Action Employer, including disabilities and veterans.
Risk Data Scientist
Data Scientist Job 15 miles from Pelham
Thank you for your interest in a career at Regions. At Regions, we believe associates deserve more than just a job. We believe in offering performance-driven individuals a place where they can build a career --- a place to expect more opportunities. If you are focused on results, dedicated to quality, strength and integrity, and possess the drive to succeed, then we are your employer of choice.
Regions is dedicated to taking appropriate steps to safeguard and protect private and personally identifiable information you submit. The information that you submit will be collected and reviewed by associates, consultants, and vendors of Regions in order to evaluate your qualifications and experience for job opportunities and will not be used for marketing purposes, sold, or shared outside of Regions unless required by law. Such information will be stored in accordance with regulatory requirements and in conjunction with Regions' Retention Schedule for a minimum of three years. You may review, modify, or update your information by visiting and logging into the careers section of the system.
**Job Description:**
At Regions, the Risk Data Scientist researches, models, implements, and validates algorithms (predictive and prescriptive) to analyze diverse sources of data to achieve targeted outcomes.
The position at this level works with multiple teams of data scientists, analysts, and visualization experts contributing independently to solve business problems with high complexity and enable effective risk management. Additionally, the position at this level requires in-depth knowledge in quantitative analytical methods, data management, visualization, and programming skills suitable to drive data-driven decisions.
**Primary Responsibilities**
+ Works with large, structured, and un-structured datasets
+ Uses quantitative and analytical techniques to accelerate profitable growth and monitor and mitigate risk - unlocking value across all functional areas of business
+ Uses Big Data tools (e.g. Hadoop, Spark, H2O, CDSW, Domino Labs, etc.) to build data analytics solutions
+ Builds machine learning and Artificial Intelligence (AI) models from development through testing and validation
+ Designs rich data visualizations to communicate complex ideas to business leaders and executives
+ Communicates outcomes and proposed business solutions to senior Risk Data Scientists
+ Draws insights from data to make quick, well informed decisions with available information
+ Demonstrates ability to continuously learn and provide value in a dynamic environment
+ Understands all phases of the model lifecycle, ensuring that models and associated documentation comply with model validation expectations
This position is exempt from timekeeping requirements under the Fair Labor Standards Act and is not eligible for overtime pay.
**Requirements**
+ Bachelor's degree and six (6) years of related experience
+ Or Master's degree and four (4) years of related experience
+ Or Ph.D. and two (2) years of related experience in a quantitative/analytical/STEM field
+ One (1) year of hands-on experience with Big Data technologies such as Hadoop, Hive, Impala, Spark, or Kafka
+ Two (2) years of working experience with statistical and predictive modeling concepts and approaches such as machine learning, clustering and classification techniques, and artificial intelligence
+ Two (2) years of working programming experience analyzing large, complex, and multi-dimensional datasets using a variety of tools such as SAS, Python, Ruby, R, Matlab, Scala, or Java (a brief Spark-based sketch follows this list)
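As a brief illustration of analyzing a large dataset with one of the Big Data tools named above (Spark), the sketch below aggregates a distributed DataFrame with PySpark. The file path and column names are hypothetical.

```python
# Minimal sketch of analyzing a large dataset with Spark (PySpark).
# The file path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("risk-analytics-sketch").getOrCreate()

# Read a (potentially very large) CSV of transactions into a distributed DataFrame.
df = spark.read.csv("s3://example-bucket/transactions.csv", header=True, inferSchema=True)

# Aggregate average transaction amount and row count per customer segment.
summary = (
    df.groupBy("segment")
      .agg(F.avg("amount").alias("avg_amount"), F.count("*").alias("n"))
      .orderBy(F.desc("n"))
)
summary.show(10)
spark.stop()
```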
**Preferences**
+ Background in banking and/or other financial services
+ Experience in Agile Software Development
+ May require experience in libraries such as TensorFlow, Pytorch, or Keras
+ Knowledge in Google Analytics and/or Adobe Digital
**Skills and Competencies**
+ Advanced Structured Query Language (SQL) skills
+ Comfortable with both relational databases and Hadoop-based data mining frameworks
+ Deep understanding of statistical and predictive modeling concepts, machine learning approaches, clustering and classification techniques, and recommendation and optimization algorithms
+ Expertise in analyzing large, complex, multi-dimensional datasets
+ Proficient in visualization tools like Power Business Intelligence (BI) and Tableau
+ Strong business acumen with the ability to communicate with both business and Information Technology (IT) leaders
+ Strong communication skills through data visualizations as well as written and oral presentations
**Position Type**
Full time
**Compensation Details**
Pay ranges are job specific and are provided as a point-of-market reference for compensation decisions. Other factors which directly impact pay for individual associates include: experience, skills, knowledge, contribution, job location and, most importantly, performance in the job role. As these factors vary by individuals, pay will also vary among individual associates within the same job.
The target information listed below is based on the Metropolitan Statistical Area Market Range for where the position is located and level of the position.
**Job Range Target:**
**_Minimum:_** $113,021.87 USD
**_Median:_** $158,829.90 USD
**Incentive Pay Plans:**
**Benefits Information**
Regions offers a benefits package that is flexible, comprehensive and recognizes that "one size does not fit all" for benefits-eligible associates. (********************************************************************) Listed below is a synopsis of the benefits offered by Regions for informational purposes, which is not intended to be a complete summary of plan terms and conditions.
+ Paid Vacation/Sick Time
+ 401K with Company Match
+ Medical, Dental and Vision Benefits
+ Disability Benefits
+ Health Savings Account
+ Flexible Spending Account
+ Life Insurance
+ Parental Leave
+ Employee Assistance Program
+ Associate Volunteer Program
Please note, benefits and plans may be changed, amended, or terminated with respect to all or any class of associate at any time. To learn more about Regions' benefits, please click or copy the link below to your browser.
***********************************************
**Location Details**
Regions Center
**Location:**
Birmingham, Alabama
Equal Opportunity Employer/including Disabled/Veterans
Job applications at Regions are accepted electronically through our career site for a minimum of five business days from the date of posting. Job postings for higher-volume positions may remain active for longer than the minimum period due to business need and may be closed at any time thereafter at the discretion of the company.
Data Scientist
Data Scientist Job 15 miles from Pelham
**Advance Local** has an exciting opportunity to join our Data team as a **Data Scientist** . The **Data Scientist** applies data mining techniques, conducts statistical analysis, and builds high quality prediction systems integrated with our products and the customer data platform to increase consumer and advertising revenue. This position uses large data sets to find opportunities for product, marketing, UX and process optimization and builds models to enable advanced hypothesis testing of different courses of action to improve and increase business performance.
This position pays between $120,000 and $140,000 annually.
**What you'll be doing:**
+ Drive business results with data-based models, analysis, and insights, working with stakeholders from departments including consumer revenue, product, content, advertising, and analytics; improve business outcomes using and developing a new integrated data platform as well as the customer data platform, applying tools and techniques such as A/B testing.
+ Build and implement models, use and create algorithms to apply to data sets and run simulations.
+ Coordinate with different functional teams to implement models and monitor outcomes and campaign effectiveness.
+ Mine and analyze data from company databases to drive optimization and improvement of marketing techniques, business strategies and sales tactics.
+ Assess the effectiveness and accuracy of new data sources and data gathering techniques.
+ Use predictive modeling to increase and optimize customer experiences, revenue generation, sales targeting and other business outcomes.
+ Develop company A/B testing framework and test model quality.
+ Develop processes and tools to monitor and analyze model performance and data accuracy.
+ Develop materials - presentations, communications, reports, etc. to explain findings.
**Our ideal candidate will have the following:**
+ Bachelor's degree in statistics, mathematics, computer science or another quantitative field required; master's degree preferred
+ Minimum three years' experience manipulating data sets and building statistical models
+ Strong problem-solving skills with an emphasis on using disparate data sources
+ Experience using statistics, propensity model building, and algorithms to manipulate data and draw insights from large data sets
+ Experience working with and creating data sets
+ Experience performing A/B testing and reporting results (a minimal evaluation sketch follows this list)
+ Knowledge of a variety of machine learning techniques and their real-world advantages/drawbacks to recommend business-growth campaigns
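As a minimal illustration of evaluating an A/B test like those described above, the sketch below applies a two-proportion z-test with statsmodels. The conversion counts are hypothetical.

```python
# Minimal sketch of evaluating an A/B test with a two-proportion z-test.
# The conversion counts and visitor totals are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]   # conversions in control (A) and variant (B)
visitors = [10000, 10000]  # visitors exposed to each version

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {stat:.2f}, p = {p_value:.4f}")

# A small p-value (for example, below 0.05) suggests the difference in conversion
# rates between A and B is unlikely to be due to chance alone.
```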
**Additional Information**
Advance Local Media offers competitive pay and a comprehensive benefits package with affordable options for your healthcare including medical, dental and vision plans, mental health support options, flexible spending accounts, fertility assistance, a competitive 401(k) plan to help plan for your future, generous paid time off, paid parental and caregiver leave and an employee assistance program to support your work/life balance, optional legal assistance, life insurance options, as well as flexible holidays to honor cultural diversity.
Advance Local Media is one of the largest media groups in the United States, which operates the leading news and information companies in more than 20 cities, reaching 52+ million people monthly with our quality, real-time journalism and community engagement. Our company is built upon the values of Integrity, Customer-first, Inclusiveness, Collaboration and Forward-looking. For more information about Advance Local, please visit *********************
Advance Local Media includes MLive Media Group, Advance Ohio, Alabama Media Group, NJ Advance Media, Advance Media NY, MassLive Media, Oregonian Media Group, Staten Island Media Group, PA Media Group, ZeroSum, Headline Group, Adpearance, Advance Aviation, Advance Healthcare, Advance Education, Advance National Solutions, Advance Recruitment, Advance Travel & Tourism, Cloud Theory, Hoot Interactive, Red Clay Media, Search Optics, Subtext.
Advance Local Media is proud to be an equal opportunity employer, encouraging applications from people of all backgrounds. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, genetic information, national origin, age, disability, sexual orientation, marital status, veteran status, or any other category protected under federal, state or local law.
If you need a reasonable accommodation because of a disability for any part of the employment process, please contact Human Resources and let us know the nature of your request and your contact information.
Advance Local Media does not provide sponsorship for work visas or employment authorization in the United States. Only candidates who are legally authorized to work in the U.S. will be considered for this position.
Senior Data Scientist - Algorithm Architect
Data Scientist Job 15 miles from Pelham
Description & Requirements
We now have an exciting opportunity for a Senior Data Scientist specializing in algorithm architectures to join the Maximus AI Accelerator supporting the enterprise at large. We are looking for an accomplished hands-on individual contributor and team player to be a part of the AI Accelerator team.
You will be responsible for architecting and optimizing scalable, secure AI systems and integrating AI models in production using MLOps best practices, ensuring systems are resilient, compliant, and efficient. Additionally, you will be responsible for supporting the development of new and novel algorithms that ensure the responsible deployment of AI across the enterprise and support the development of data driven decision making methodologies. This role requires strong systems thinking, problem-solving abilities, mathematics, and the capacity to manage risk and change in complex environments. Success depends on cross-functional collaboration, strategic communication, and adaptability in fast-paced, evolving technology landscapes.
This position will be focused on strategic company-wide initiatives but will play a role in project delivery and capture solutioning (i.e., leaning in on existing or future projects and providing solutioning to capture new work.)
Essential Duties and Responsibilities:
- Develop, collaborate, and advance the applied and responsible use of AI, ML, mathematics, and data science solutions throughout the organization by finding the right fit of tools, technologies, methodologies, processes, and automation to enable effective and efficient solutions for each unique situation.
- Contribute and lead efforts across the organization to support the creation of solutions and real mission outcomes leveraging AI capabilities from Computer Vision, Natural Language Processing, Large Language Models (LLMs) and classical machine learning.
- Develop mathematically rigorous process improvement procedures.
- Maintain advanced, current knowledge of the AI technology landscape and emerging developments, evaluating their applicability for use in production/operational environments.
- Contribute and lead the creation, curation, and promotion of playbooks, best practices, lessons learned and firm intellectual capital.
Minimum Requirements (required skills that align with contract LCAT, verifiable, and measurable):
- Professional Programming experience (e.g. Python, R, etc.)
- Experience in two of the following: Computer Vision, Natural Language Processing, Deep Learning, and/or Classical ML.
- Experience with API programming
- Experience with Linux
- Experience with Statistics
- Experience with Linear Algebra and Calculus
- Experience applying mathematical principles to real world problems
- Experience with Machine Learning
- Experience working as a contributor on a team
Minimum Education requirement:
- BS in a quantitative discipline (e.g. Math, Physics, Engineering, Economics, Computer Science, etc.)
Years of Required Work-Related Experience:
- 3+ yrs experience in Artificial Intelligence and Machine Learning
- 3+ yrs experience in Software Development
- 3+ yrs of Mathematical Modeling
Required Certifications:
- None
Minimum Requirements
- Bachelor's degree in relevant field of study and 5+ years of relevant professional experience required, or equivalent combination of education and experience.
Preferred Key Skills and Abilities (not contractually required):
- Master's degree in a quantitative discipline
- Experience developing machine learning or signal processing algorithms:
- Ability to leverage mathematical principles to model new and novel behaviors.
- Ability to leverage statistics to identify true signals from noise or clutter
- Experience working as an individual contributor in AI
- Use of state-of-the-art technology to solve operational problems in AI and Machine Learning.
- Strong knowledge of data structures, common computing infrastructures/paradigms (stand alone and cloud), and software engineering principles
- Ability to design custom solutions in the AI and Advanced Analytics sphere for customers. This includes the ability to scope customer needs, identify currently existing technologies, and develop custom software solutions to fill any gaps in available off the shelf solutions.
- Ability to build reference implementations of operational AI & Advanced Analytics processing solutions.
- Use of a variety of programming languages, including but not limited to Python/Java and frontend frameworks for POC demos and dashboarding
- Use and development of program automation, CI/CD, DevSecOps, and Agile
- Experience with deep learning model architecture development and philosophy
- Cloud certifications (AWS, Azure, or GCP)
- 5+ yrs of related experience in AI, advanced analytics, computer science, mathematical modeling, or software development.
- Python Experience with TensorFlow, PyTorch, and Pandas
EEO Statement
Maximus is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, age, national origin, disability, veteran status, genetic information and other legally protected characteristics.
Pay Transparency
Maximus compensation is based on various factors including but not limited to job location, a candidate's education, training, experience, expected quality and quantity of work, required travel (if any), external market and internal value analysis including seniority and merit systems, as well as internal pay alignment. Annual salary is just one component of Maximus's total compensation package. Other rewards may include short- and long-term incentives as well as program-specific awards. Additionally, Maximus provides a variety of benefits to employees, including health insurance coverage, life and disability insurance, a retirement savings plan, paid holidays and paid time off. Compensation ranges may differ based on contract value but will be commensurate with job duties and relevant work experience. An applicant's salary history will not be used in determining compensation. Maximus will comply with regulatory minimum wage rates and exempt salary thresholds in all instances.
Minimum Salary: $123,440.00
Maximum Salary: $180,000.00
Senior Data Scientist, Business Strategy-Revenue Cycle Section
Data Scientist Job 15 miles from Pelham
Schedule: Monday-Friday, 8am-5pm
Benefits include: 100% tuition assistance, wellness initiatives, generous paid time off, paid parental leave, Public Service Loan Forgiveness Program eligible employer, plus more. In addition to our many benefits and perks, UAB Medicine provides a variety of resources to support employees both personally and professionally.
To provide proven expertise in both frontend and backend data science development, in Python or related statistical languages. To develop innovative computational & data science methods and provide solutions that leverage a variety of techniques, including mathematical theories and engineering principles. To collaborate within the organization, provide data story-telling and strategic consultation, and to help understand and address the unique and complex analytics needs of various departments within UAB. To work closely with internal partners to understand business needs, scope analytics efforts, embed analytics into decision-making processes and guide strategic decisions using data. To effectively communicate analytic findings and recommendations to both technical and business leaders. To pioneer new approaches and technology for the organization, constantly looking for new ways to generate value, and adapting to meet the needs of internal stakeholders and initiatives.
Key Duties & Responsibilities:
1. Develop computational methods and effectively resolve research problems utilizing multiple scientific methodologies.
2. Partner with business partners across departments to address complex issues through evaluation and problem assessment.
3. Conduct advanced research in computational science, including developing new algorithms, models, or simulations to address complex scientific problems.
4. Enhance and expand algorithms and software to guarantee efficiency and provide consulting services regarding computer applications, providing technical support and expertise in computational modeling and simulation tools to other researchers and departments.
5. Perform computerized data processing, analysis, and reviews, and generate reports to present pertinent information to various groups.
6. Collaborates in problem assessment, analysis, and development of computational methods or procedures, independently evaluates, selects, and applies appropriate methods to the specific research problems.
7. Proposes solutions in engineering, the sciences, and other fields using mathematical theories and techniques.
8. Optimizes and extends algorithms, analysis pipelines, and software in order to establish and ensure effectiveness and scalability of computing infrastructure.
9. Provides consultation for and expertise with computer applications to complex research problems; performs computerized data processing operations and statistical analyses of research data; creates visualizations of results.
10. Performs code review, analysis review, and writes summary reports to collaborators.
11. Creates compelling presentations for senior audiences and presents them to senior stakeholders across the organization.
12. Engages in cross-functional project management, driving projects to completion in a timely manner.
13. Queries and analyzes data at scale to craft strategy and recommendations for business partners.
14. Perform other duties as assigned.
Position Requirements:
Doctorate degree in a related field or Master's degree in a related field and two (2) years of related experience or Bachelor's degree in related field and four (4) years of related experience required. Work experience may NOT substitute for the education requirement.
Relevant degrees: Computer Science, Biosystem Engineering, Analytics, Economics, Statistics, or related fields.
LICENSE, CERTIFICATION AND/OR REGISTRATION:
Required: None
TRAITS & SKILLS:
Must be self-directed / self-motivated; must have excellent communication and possess outstanding customer service and interpersonal skills. Must be able to: (1) perform a variety of duties often changing from one task to another of a different nature without loss of efficiency or composure; (2) accept responsibility for one's own work; (3) work independently; (4) recognize the rights and responsibilities of patient confidentiality; (5) convey empathy and compassion to those experiencing pain, physical or emotional distress and/or grief; (6) relate to others in a manner which creates a sense of teamwork and cooperation; (7) communicate effectively with people from every socioeconomic, cultural and educational background; (8) exhibit flexibility and cope effectively in an ever-changing, fast-paced healthcare environment; (9) perform effectively when confronted with emergency, critical, unusual or dangerous situations; (10) demonstrate the quality work ethic of doing the right thing the right way; and (11) maintain a customer focus and strive to satisfy the customer's perceived needs.
UA Health Services Foundation (UAHSF) is proud to be an AA/EOE/M/F/Vet/Disabled employer.
Data Engineer/Analyst - Business Process & Innovation
Data Scientist Job 15 miles from Pelham
The Data Engineer role will provide critical support to stakeholders of supported applications and to other members of the Business Process & Innovation team. This role will be responsible for developing data management and governance processes, creating and supporting automated data pipelines and executing data cleanup for applications supported by the team. This position requires a strong blend of collaboration, technical proficiency, and problem-solving abilities to ensure data accessibility, quality and reliability.
JOB RESPONSIBILITIES
Design, develop and maintain robust and scalable data pipelines to ingest, process and store large volumes of structured and unstructured data
Collaborate with cross-functional teams to understand data requirements and translate them into technical specifications
Implement and optimize ETL (Extract, Transform, Load) processes to ensure efficient data flow and transformation
Ensure data quality and integrity by implementing data validation and cleansing processes (a minimal pipeline sketch follows this list)
Monitor and troubleshoot data pipeline performance, identifying and resolving issues proactively
Develop and maintain data models and schemas to support analytics and reporting needs
Work with cloud-based and on-prem data platforms and tools to manage and process data efficiently
Stay up to date with emerging data engineering technologies and best practices to drive continuous improvement
Ensure compliance with data governance and security policies
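As a minimal illustration of the ETL and validation work described above, the sketch below extracts a CSV, applies basic cleansing and quality checks with pandas, and loads the result to Parquet. File names, column names, and rules are hypothetical.

```python
# Minimal sketch of an extract-transform-load step with basic validation in pandas.
# File names, column names, and validation rules are hypothetical.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Validation/cleansing: drop rows missing a key, normalize text, enforce types.
    df = df.dropna(subset=["customer_id"])
    df["email"] = df["email"].str.strip().str.lower()
    df["order_total"] = pd.to_numeric(df["order_total"], errors="coerce").fillna(0.0)
    # Simple quality check: fail loudly rather than silently pass bad data downstream.
    if (df["order_total"] < 0).any():
        raise ValueError("Negative order totals found")
    return df

def load(df: pd.DataFrame, path: str) -> None:
    # Writing Parquet requires pyarrow (or fastparquet) to be installed.
    df.to_parquet(path, index=False)

if __name__ == "__main__":
    load(transform(extract("orders.csv")), "orders_clean.parquet")
```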
JOB REQUIREMENTS
Education Requirements:
Bachelor's degree in Computer Science, Engineering, Information Technology or a related field. Advanced degrees are a plus.
Experience Requirements:
Experience in data engineering, software development, or a related field
Proven experience with data pipeline tools
Experience with SQL and database technologies
Experience with big data tools and technologies
Familiarity with cloud data platforms (AWS, Azure)
Knowledge, Skills & Abilities:
Proficiency in programming languages such as Python or Java
Strong understanding of data warehousing concepts and best practices
Experience with data modeling, architecture and integration
Excellent problem solving skills and attention to detail
Strong communication skills
Ability to work with a broad team
Knowledge of data security and privacy best practices
Familiarity with machine learning and AI tools is a plus
Behavioral Attributes:
Model “Our Values”: Safety First, Intentional Inclusion, Act with Integrity and Superior Performance
Passionate, demonstrable commitment to safety.
Values and takes personal initiative to support a diverse and inclusive workplace.
Champions teamwork and upholds the decisions of the team.
Maintains a customer-focused mindset and achieves excellence in customer satisfaction.
Unwavering adherence to the Southern Company Code of Ethics.
Demonstrates a commitment to the pursuit of continuous improvement.
Demonstrates a commitment to Innovation - identifies new opportunities to improve products & services then develops and implements plans to achieve results.
Exadata Engineer Senior
Data Scientist Job 15 miles from Pelham
Title - Exadata Administrator Senior
Position Location: Strongsville, OH; Pittsburgh, PA; Birmingham, AL; Dallas, TX; Phoenix, AZ. Hybrid, 3 days onsite
Duration - 1 Year
REQUIRED SKILLS:
Exadata Administrator Requirements:
Minimum 5 years of full stack Exadata support
Full stack patching of the Exadata appliance
Strong analytical and communication skills
Extensive experience troubleshooting performance issues
Understanding of the process for replacing failed components of the appliance
At least 5 years of Oracle DBA experience
Experience supporting Oracle Recovery Appliances
Please apply to the job if you are interested and have the required experience to ***************************
#LI-RG1
#M1
Ref: #404-IT Pittsburgh
System One, and its subsidiaries including Joulé, ALTA IT Services, CM Access, TPGS, and MOUNTAIN, LTD., are leaders in delivering workforce solutions and integrated services across North America. We help clients get work done more efficiently and economically, without compromising quality. System One not only serves as a valued partner for our clients, but we offer eligible full-time employees health and welfare benefits coverage options including medical, dental, vision, spending accounts, life insurance, voluntary plans, as well as participation in a 401(k) plan.
System One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, age, national origin, disability, family care or medical leave status, genetic information, veteran status, marital status, or any other characteristic protected by applicable federal, state, or local law.
Data Engineer Manager
Data Scientist Job 15 miles from Pelham
Job Details
Travel Percentage: Negligible
Job Shift: Day
Description
Purpose
The Data Engineering Manager brings strong expertise in ETL methodologies, data transformation, and cloud-based data platforms to lead a data engineering team in optimizing data handling and workflows. This role's focus on leveraging AI-driven automation techniques will enhance data processing efficiency over time. The ideal candidate for this position has a blend of hands-on technical ability, leadership skills specific to data engineering, and strategic vision for data processing.
Responsibilities
The responsibilities for this position include the following:
Provide general Leadership, Management, and Accountability (LMA) for ETL developers and data engineers, ensuring best practices in data processing.
Develop and maintain ETL pipelines for structured and unstructured data processing.
Implement AI-driven automation techniques to improve ETL efficiency over time.
Ensure data quality, security, and compliance in all processes and foster a culture of continuous improvement.
Establish and track KPIs for ETL performance, AI-driven automation, data integrity, and team productivity.
Qualifications
Competencies and Qualities
Qualified candidates must have the following competencies and qualities:
Excellent problem-solving, project management, and stakeholder communication skills
Interest in AI/ML techniques for data processing, with a willingness to apply AI-driven automation over time
Able to lead a team to achieve high-level goals by initiating detailed, actionable plans
Attentive to the skill level and performance of direct reports
Resourceful and creative problem solver
Education, Experience, and Certifications Required
5+ years of experience in ETL development, data engineering, or a related field
3+ years of leadership experience in a software development or data engineering role
Experience working with unstructured data and data transformation techniques
Hands-on experience with ETL tools and data processing frameworks
Proficiency in Microsoft development tools, including Visual Studio, using C#
Experience with cloud-based data platforms (AWS Glue, Redshift, etc.)
Strong knowledge of database systems, data lakes, and data warehouse solutions
Preferred
Hands-on experience or familiarity with AI/ML techniques for data transformation and anomaly detection
Experience with DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes)
Experience with S3, EC2, Secrets Manager, CloudWatch, RDS
Experience building complicated Regular Expressions
Knowledge of data governance and regulatory compliance frameworks
Supervisory Responsibility
This position provides direct oversight and direction to a team of ETL developers/ data engineers, including assigning work, hiring for open positions, and conducting performance reviews.
Work Environment
This is a remote position, but there are regular meetings and critical team discussions at our client's main office in Irondale, Alabama.
Travel
This position requires little to no travel, but it may occasionally require trips to our client's office for team meetings.
Physical Demand
This role will require using a computer for long periods of time while either sitting or standing.
Position Type and Expected Hours
This is a full-time position for five, eight-hour days (at least 40 hours) per week.
Other Duties
Please note that this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities. Activities, duties, and responsibilities may change at any time with or without notice.
Sr. Data Engineer
Data Scientist Job 15 miles from Pelham
Employment Type: Full-Time, Mid-level
Department: Business Intelligence
CGS is seeking a passionate and driven Data Engineer to support a rapidly growing Data Analytics and Business Intelligence platform focused on providing solutions that empower our federal customers with the tools and capabilities needed to turn data into actionable insights. The ideal candidate is a critical thinker and perpetual learner; excited to gain exposure and build skillsets across a range of technologies while solving some of our clients' toughest challenges.
CGS brings motivated, highly skilled, and creative people together to solve the government's most dynamic problems with cutting-edge technology. To carry out our mission, we are seeking candidates who are excited to contribute to government innovation, appreciate collaboration, and can anticipate the needs of others. Here at CGS, we offer an environment in which our employees feel supported, and we encourage professional growth through various learning opportunities.
Skills and attributes for success:
-Complete development efforts across the data pipeline to store, manage, and provision data to data consumers.
-Being an active and collaborating member of an Agile/Scrum team and following all Agile/Scrum best practices.
-Write code to ensure the performance and reliability of data extraction and processing.
-Support continuous process automation for data ingest.
-Achieve technical excellence by advocating for and adhering to lean-agile engineering principles and practices such as API-first design, simple design, continuous integration, version control, and automated testing.
-Work with program management and engineers to implement and document complex and evolving requirements.
-Help cultivate an environment that promotes customer service excellence, innovation, collaboration, and teamwork.
-Collaborate with others as part of a cross-functional team that includes user experience researchers and designers, product managers, engineers, and other functional specialists.
Qualifications:
-Must be a US Citizen.
-Must be able to obtain a Public Trust Clearance.
-7+ years of IT experience including experience in design, management, and solutioning of large, complex data sets and models.
-Experience developing data pipelines from many sources, including structured and unstructured data sets in a variety of formats.
-Proficiency in developing ETL processes and performing test and validation steps.
-Proficiency in manipulating data (Python, R, SQL, SAS).
-Strong knowledge of big data analysis and storage tools and technologies.
-Strong understanding of agile principles and the ability to apply them.
-Strong understanding of CI/CD pipelines and the ability to apply them.
-Experience with relational databases such as PostgreSQL.
-Comfortable working in version control systems such as Git repositories.
Ideally, you will also have:
-Experience creating and consuming APIs.
-Experience with DHS and knowledge of DHS standards a plus.
-Candidates will be given special consideration for extensive experience with Python.
-Ability to develop visualizations utilizing Tableau or PowerBI.
-Experience in developing Shell scripts on Linux.
-Demonstrated experience translating business and technical requirements into comprehensive data strategies and analytic solutions.
-Demonstrated ability to communicate across all levels of the organization and communicate technical terms to non-technical audiences.
Our Commitment:
Contact Government Services (CGS) strives to simplify and enhance government bureaucracy through the optimization of human, technical, and financial resources. We combine cutting-edge technology with world-class personnel to deliver customized solutions that fit our client's specific needs. We are committed to solving the most challenging and dynamic problems.
For the past seven years, we've been growing our government-contracting portfolio, and along the way, we've created valuable partnerships by demonstrating a commitment to honesty, professionalism, and quality work.
Here at CGS, we value honesty through hard work and self-awareness, professionalism in all we do, and delivering the best quality to our consumers, maintaining those relationships for years to come.
We care about our employees. Therefore, we offer a comprehensive benefits package:
-Health, Dental, and Vision
-Life Insurance
-401k
-Flexible Spending Account (Health, Dependent Care, and Commuter)
-Paid Time Off and Observance of State/Federal Holidays
Contact Government Services, LLC is an Equal Opportunity Employer. Applicants will be considered without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
Join our team and become part of government innovation!
Explore additional job opportunities with CGS on our Job Board:
*************************************
For more information about CGS please visit: ************************** or contact:
Email: *******************
$144,768 - $209,109.33 a year
Data Engineer (only W2 contract no C2C)
Data Scientist Job 15 miles from Pelham
Hi,
Hope you're doing well.
This is Pankaj from 4P Consulting; please see the job description details below.
Note: Only W2, no C2C allowed. Please don't send any resumes for C2C.
Job Title: Data Engineer - Advanced Metering Infrastructure (AMI) Analytics
Location: Birmingham, AL
Contract: 12 Months
Job Summary:
The Data Engineer on APC Power Delivery's AMI Analytics team plays a pivotal role in advancing data-driven decision-making and operational efficiency across business units. Working closely with data scientists and within a cloud environment, this role requires expertise in data engineering, cloud technologies, and collaborative problem-solving.
Key Responsibilities:
Data Pipeline Development:
· Assist TO in designing, building, and maintaining scalable data pipelines to ingest, transform, and store large volumes of AMI data from various sources.
· Collaborate with data scientists to ensure seamless integration of data pipelines with analytical models and AI processes.
Data Modeling and Architecture:
· Develop and implement data models and architectures optimized for AMI data analytics, ensuring efficiency, scalability, and data integrity.
· Implement best practices for data storage, partitioning, and indexing to optimize performance and facilitate analysis.
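To make the partitioning practice above concrete, here is a minimal sketch assuming a Spark environment with Delta Lake (as on Databricks); the schema, column names, and storage paths are hypothetical placeholders rather than details from this posting.

```python
# Minimal sketch of a partitioned Delta write for interval-style AMI data.
# Requires Spark with Delta Lake (e.g., Databricks); the landing path,
# column names, and output path are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ami-ingest").getOrCreate()

readings = (
    spark.read.json("/mnt/raw/ami/readings/")              # assumed raw landing zone
    .withColumn("reading_ts", F.to_timestamp("reading_ts"))
    .withColumn("reading_date", F.to_date("reading_ts"))   # partition column
)

(
    readings.write.format("delta")
    .mode("append")
    .partitionBy("reading_date")    # lets date-filtered queries skip partitions
    .save("/mnt/curated/ami/readings")
)
```

Partitioning by reading date lets downstream queries that filter on a date range skip irrelevant files, which is usually the main performance lever for interval data of this kind.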
Cloud Environment Management:
· Work within cloud environments (Azure) to deploy, manage, and optimize data engineering solutions.
· Collaborate with cloud architects and administrators to ensure security, compliance, and cost-effectiveness of cloud infrastructure.
Power BI Visualization and Reporting:
· Develop and maintain Power BI dashboards and reports to visualize AMI data insights and facilitate data-driven decision-making.
· Ensure the accuracy, reliability, and usability of visualizations to meet business requirements.
Data Cataloging and Documentation:
· Catalog and document AMI data sources, datasets, and metadata to facilitate data discovery, lineage, and governance.
· Implement data cataloging best practices to ensure the availability and accessibility of AMI data assets.
Collaboration and Support:
· Collaborate with cross-functional teams to understand data requirements and support analytical initiatives.
· Provide technical support and troubleshooting for data-related issues, ensuring the reliability and availability of data infrastructure.
Job Requirements:
Education/Experience:
· Bachelor's degree in Computer Science, Information Technology, or related field.
· 3+ years of experience in data engineering or related roles, preferably in a cloud environment.
Knowledge, Skills & Abilities:
· Proficiency in programming languages such as Python, SQL, and Scala.
· Proficiency with Power BI for data visualization and reporting.
· Experience with cloud-based data platforms (e.g., Azure Databricks, AWS EMR).
· Strong understanding of data modeling, ETL processes, and data warehousing concepts.
· Familiarity with big data technologies
Behavioral Attributes:
· Innovative and adaptable, with a passion for continuous learning and improvement.
· Strong communication and collaboration skills, with the ability to work effectively across teams.
· Results-oriented and committed to delivering high-quality solutions that meet business needs.
· Ethical conduct and commitment to safety in all aspects of work.
We are looking for a motivated and skilled Data Engineer to join our dynamic AMI Analytics team. If you are passionate about leveraging data to drive business insights and innovation, and have expertise in Power BI visualization and data cataloging, we encourage you to apply.
Thanks and Regards
Sr. Talent Acquisition Specialist
Data Engineer
Data Scientist Job 15 miles from Pelham
We are seeking a skilled Data Engineer to support our inpatient physical therapy operations by designing, maintaining, and optimizing data infrastructure. This role will focus on ensuring efficient data collection, processing, integration, and analysis to enhance patient care, operational efficiency, and business insights.
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to Human Resources Request Form (****************************************** Og4IQS1J6dRiMo) . The EEOC "Know Your Rights" Poster is available here (*********************************************************************************************** .
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: *************************************************** .
Skills and Requirements
- 3+ years of SQL Experience (PL/SQL preferred): Expertise in SQL, particularly PL/SQL, is crucial for querying and managing data effectively.
- 3+ years of Oracle Database Experience: A solid foundation in database management is essential for this role, ensuring efficient data storage, retrieval, and manipulation.
- Working Experience Moving Data Sets: Proficiency in handling and migrating data sets is required, with a preference for experience in TypeScript or Python (PySpark).
- Strong background in developing and optimizing Apex code.
- Knowledge or experience working with LLMs (Large Language Models) like ChatGPT, etc.
- Any Working AI Experience: Exposure to AI technologies and their practical applications is highly desirable.
- Database Analyst Experience: Previous experience as a database analyst will be beneficial.
- Palantir's Foundry Application Experience: Experience with Palantir's Foundry application will set you apart from other candidates.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal employment opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment without regard to race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or the recruiting process, please send a request to ********************.
AI & GenAI Data Scientist - Manager
Data Scientist Job 15 miles from Pelham
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals.
In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.
Enhancing your leadership style, you motivate, develop and inspire others to deliver quality. You are responsible for coaching, leveraging team members' unique strengths, and managing performance to deliver on client expectations. With your growing knowledge of how business works, you play an important role in identifying opportunities that contribute to the success of our Firm. You are expected to lead with integrity and authenticity, articulating our purpose and values in a meaningful way. You embrace technology and innovation to enhance your delivery and encourage others to do the same.
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
* Analyse and identify the linkages and interactions between the component parts of an entire system.
* Take ownership of projects, ensuring their successful planning, budgeting, execution, and completion.
* Partner with team leadership to ensure collective ownership of quality, timelines, and deliverables.
* Develop skills outside your comfort zone, and encourage others to do the same.
* Effectively mentor others.
* Use the review of work as an opportunity to deepen the expertise of team members.
* Address conflicts or issues, engaging in difficult conversations with clients, team members and other stakeholders, escalating where appropriate.
* Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
Minimum Degree Required
Bachelor's Degree
Minimum Year(s) of Experience
7 year(s)
Demonstrates extensive-level abilities and/or a proven record of success managing the identification and addressing of client needs:
* Managing development teams in building of AI and GenAI solutions, including but not limited to analytical modeling, prompt engineering, general all-purpose programming (e.g., Python), testing, communication of results, front end and back-end integration, and iterative development with clients
* Documenting and analyzing business processes for AI and Generative AI opportunities, including gathering of requirements, creation of initial hypotheses, and development of AI/GenAI solution approach
* Collaborating with client team to understand their business problem and select the appropriate models and approaches for AI/GenAI use cases
* Designing and solutioning AI/GenAI architectures for clients, specifically for plugin-based solutions (i.e., ChatClient application with plugins) and custom AI/GenAI application builds
* Managing teams to process unstructured and structured data to be consumed as context for LLMs, including but not limited to embedding of large text corpus, generative development of SQL queries, building connectors to structured databases
* Managing daily operations of a global data and analytics team on client engagements, review developed models, provide feedback and assist in analysis;
* Directing data engineers and other data scientists to deliver efficient solutions to meet client requirements;
* Leading and contributing to development of proof of concepts, pilots, and production use cases for clients while working in cross-functional teams;
* Facilitating and conducting executive level presentations to showcase GenAI solutions, development progress, and next steps
* Structuring, writing, communicating, and facilitating client presentations; and,
* Managing associates and senior associates through coaching, providing feedback, and guiding work performance.
Demonstrates extensive abilities and/or a proven record of success learning and performing in functional and technical capacities, including the following areas:
* Managing GenAI application development teams including back-end and front-end integrations
* Using Python (e.g., Pandas, NLTK, Scikit-learn, Keras, etc.), common LLM development frameworks (e.g., Langchain, Semantic Kernel), Relational storage (SQL), Non-relational storage (NoSQL);
* Experience in analytical techniques such as Machine Learning, Deep Learning and Optimization
* Vectorization and embedding, prompt engineering, and RAG (retrieval-augmented generation) workflow development (see the sketch after this list)
* Understanding of or hands-on experience with Azure, AWS, and/or Google Cloud platforms
* Experience with Git Version Control, Unit/Integration/End-to-End Testing, CI/CD, release management, etc.
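As a concrete illustration of the RAG workflow development mentioned above, here is a minimal sketch of the retrieval step, assuming the open-source sentence-transformers library; the model name, documents, and question are illustrative only, and the final call to an LLM is left as a placeholder.

```python
# Minimal sketch of the retrieval step in a RAG workflow: embed a small
# corpus, embed the user question, and select the most similar chunks to
# pass to an LLM as grounding context. Model name and corpus are illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Refunds are processed within 14 business days of approval.",
    "Support tickets are triaged by severity before assignment.",
    "Enterprise customers receive a dedicated account manager.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")   # small open embedding model
doc_vectors = model.encode(documents, normalize_embeddings=True)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the question (cosine similarity)."""
    q_vec = model.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ q_vec          # cosine, since vectors are normalized
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

context = "\n".join(retrieve("How long do refunds take?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How long do refunds take?"
# The prompt would then be sent to the chosen LLM endpoint (not shown here).
print(prompt)
```

The retrieved chunks become the grounding context that is prepended to the prompt before generation.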
Travel Requirements
Up to 80%
Job Posting End Date
Learn more about how we work: **************************
PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: ***********************************
As PwC is an equal opportunity employer, all qualified applicants will receive consideration for employment at PwC without regard to race; color; religion; national origin; sex (including pregnancy, sexual orientation, and gender identity); age; disability; genetic information (including family medical history); veteran, marital, or citizenship status; or, any other status protected by law.
For only those qualified applicants that are impacted by the Los Angeles County Fair Chance Ordinance for Employers, the Los Angeles' Fair Chance Initiative for Hiring Ordinance, the San Francisco Fair Chance Ordinance, San Diego County Fair Chance Ordinance, and the California Fair Chance Act, where applicable, arrest or conviction records will be considered for Employment in accordance with these laws. At PwC, we recognize that conviction records may have a direct, adverse, and negative relationship to responsibilities such as accessing sensitive company or customer information, handling proprietary assets, or collaborating closely with team members. We evaluate these factors thoughtfully to establish a secure and trusted workplace for all.
Applications will be accepted until the position is filled or the posting is removed, unless otherwise set forth on the following webpage. Please visit this link for information about anticipated application deadlines: ***************************************
The salary range for this position is: $100,000 - $232,000, plus individuals may be eligible for an annual discretionary bonus. For roles that are based in Maryland, this is the listed salary range for this position. Actual compensation within the range will be dependent upon the individual's skills, experience, qualifications and location, and applicable employment laws. PwC offers a wide range of benefits, including medical, dental, vision, 401k, holiday pay, vacation, personal and family sick leave, and more. To view our benefits at a glance, please visit the following link: ***********************************
Lead Data Scientist
Data Scientist Job 15 miles from Pelham
Do you have a strong background in machine learning and deep learning? Are you interested in utilizing your data science skills and working with a small team in a fast-paced environment to achieve strategic mission goals? If so, Deloitte has an exciting opportunity for you!
The Team:
The GPS GSi group at Deloitte is dedicated to driving innovation and efficiency through advanced data science and business intelligence solutions. Our team collaborates closely with Enabling Area professionals to develop and implement cutting-edge machine learning, deep learning, and generative AI initiatives. We thrive in a dynamic environment where teamwork and independent problem-solving are key to achieving our strategic mission goals. Join us to be part of a small, agile team that is at the forefront of transforming decision-making processes and delivering impactful solutions.
Recruiting for this role ends on July 31st, 2025.
What You'll Do:
As a member of our GPS GSi group, you will play a crucial role in the development and maintenance of our data science and business intelligence solutions. This role will specialize in assisting with machine learning, deep learning, and generative AI initiatives that will be utilized by Enabling Area professionals to enhance and expedite decision-making. You will provide expertise within and across business teams, demonstrate the ability to work independently, and apply problem-solving skills to resolve complex issues.
+ Lead and participate in developing data science products, transforming client needs into quantifiable solutions.
+ Independently carry out tasks, using critical thinking and problem-solving skills to devise effective solutions.
+ Design, train, and deploy machine learning and deep learning models on platforms like AWS, Databricks, and Dataiku.
+ Develop and advise on Large Language Model (LLM) solutions for enterprise-wide documentation, including RAG, CPT, and SFT.
+ Utilize MLOps pipelines, including containerization (Docker) and CI/CD, for training and deploying models.
+ Write clean, well-commented code for easy collaboration and maintain structured project documentation using GitHub and/or Jira.
Qualifications
Required skills:
+ Bachelor's Degree in Statistics, Mathematics, Computer Science, Engineering, or another analytical field.
+ 6+ years of experience in data science, with deep knowledge of Python, machine learning, deep learning, and related packages (e.g., sklearn, TensorFlow, PyTorch). Proven ability to lead data science projects from inception to deployment.
+ Strong knowledge of Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG).
+ Familiarity with AWS, Databricks, and/or Dataiku platforms.
+ Working knowledge of MLOps, including containerization (e.g., Docker).
+ Strong organizational skills, with clear project documentation and the ability to write clean code.
+ Familiarity with agile project methodology and project development lifecycle.
+ Experience with GitHub for version control.
+ Ability to manage multiple detailed tasks and responsibilities simultaneously, meeting deadlines and objectives accurately.
+ Must be legally authorized to work in the United States without employer sponsorship, now or in the future.
+ Ability to travel 10%-25%, on average, based on the work you do and the clients and industries/sectors you serve
Preferred Skills:
+ Master's Degree in Statistics, Mathematics, Computer Science, Engineering, or another analytical field, or equivalent direct work experience.
+ Significant experience with MLOps and associated serving frameworks (e.g., Flask, FastAPI) and orchestration pipelines (e.g., SageMaker Pipelines, Step Functions, Metaflow); a minimal serving sketch follows this list.
+ Significant experience working with open-source LLMs, including serving via TGI/vLLM and performing Continued Pre-training (CPT) and/or Supervised Fine-tuning (SFT).
+ Experience using various AWS Services (e.g., Textract, Transcribe, Lambda, etc.).
+ Proficiency in basic front-end web development (e.g., Streamlit).
+ Knowledge of Object-Oriented Programming (OOP) concepts.
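As an illustration of the serving-framework experience noted above, here is a minimal FastAPI sketch of the kind of inference endpoint that is typically containerized with Docker and promoted through a CI/CD pipeline; the model file name and feature layout are hypothetical assumptions, not details from this posting.

```python
# Minimal sketch of serving a trained model behind a FastAPI endpoint.
# The model file name and feature layout are hypothetical placeholders.
from typing import List

import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-serving-sketch")
model = joblib.load("model.pkl")  # assumed scikit-learn estimator saved earlier

class Features(BaseModel):
    values: List[float]  # flat feature vector, in training-time order

@app.post("/predict")
def predict(features: Features) -> dict:
    X = np.asarray(features.values).reshape(1, -1)
    return {"prediction": float(model.predict(X)[0])}

# Run locally with:  uvicorn serve:app --reload   (assuming this file is serve.py)
```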
The wage range for this role takes into account the wide range of factors that are considered in making compensation decisions including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled. At Deloitte, it is not typical for an individual to be hired at or near the top of the range for their role and compensation decisions are dependent on the facts and circumstances of each case. A reasonable estimate of the current range is $97,600 to $179,900
You may also be eligible to participate in a discretionary annual incentive program, subject to the rules governing the program, whereby an award, if any, depends on various factors, including, without limitation, individual and organizational performance.
Information for applicants with a need for accommodation: ************************************************************************************************************
EA_ExpHire, EA_GPS_ExpHire, #LI-JK2
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.
Data Engineer II, TO
Data Scientist Job 15 miles from Pelham
ASCEND Program Information
Southern Company is committed to building the future of energy for the customers who depend on us, the communities we serve and the industry we lead. The ASCEND program will transform how we do business, helping us elevate the customer experience, adapt to industry changes, and implement technology that offers new capabilities with more agility. It will involve implementing multiple applications, including the replacement of our meter data management (MDM) and customer service systems (CSS) with a new, Oracle customer information system (CIS) called customer to meter (C2M). Additionally, it includes implementing an advanced analytics platform (AAP), customer experience (CX) and customer engagement platform (CEP).
The new platform, to be rolled out to Alabama Power, Georgia Power, and Mississippi Power, will be implemented in phases over the next four years. Currently, we are building an ASCEND organization dedicated to helping the electric operating companies realize the vision for a modernized, efficient, digital customer experience to deeply engage with our customers and provide a more personalized experience. ASCEND will be one of the most comprehensive Customer Service and Marketing transformation initiatives in our company's history.
Each Leader, Technology (CIS/MDM) position is responsible for the oversight, management, and delivery of critical activities within the Technology Workstream required for the program.
The location for this position is flexible between Birmingham or Atlanta.
The successful candidate will remain an employee of the operating company while reporting to the CIS/MDM Project Team and may report to a SCS or operating company leader for delivery. Technology employees will be aligned to technology leaders, and business employees to business leaders.
About Us
Southern Company is upgrading its Customer Information System (CIS) to better serve 4.6 million electricity customers. The new system leverages Databricks' lakehouse architecture and Azure's cloud services to enhance data management, streamline reporting, and empower advanced analytics, data science, and artificial intelligence initiatives.
Alongside these modern tools, we are assembling an agile and collaborative data management practice focused on delivering business-aligned objectives through robust, accurate, and scalable data products and features. Our innovative customer data solutions prioritize strong governance, regulatory compliance, and user-centric design, driving exceptional customer experiences and operational excellence.
The Customer Data Platform (CDP) is a data lake house built atop Databricks within Microsoft's Azure cloud. Information relative to Southern Company's customers is managed in CDP to produce data products and features used for customer and partner facing data projects, reporting and analytics. CDP rapidly puts Southern Company's customer information to work towards the achievement of the greatest value for our customers and shareholders.
Job Summary
We are seeking a skilled and motivated Data Engineer to join our team and contribute to the development and maintenance of our data lake house. The ideal candidate will have a strong background in data engineering, cloud platforms (specifically Azure), and data analytics. This role will involve designing, implementing, and optimizing data pipelines to ensure the scalability, security, and performance of our data lake house, meeting the needs of our business and customers.
Key Responsibilities
Design, develop, and maintain data pipelines and ETL/ELT processes for the data lake house on Databricks in Azure (a minimal sketch follows this list).
Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
Ensure data quality, consistency, and security across all data sources and pipelines.
Implement best practices for data governance, data management, and data lifecycle processes.
Optimize the performance, scalability, and cost-efficiency of data workflows and storage solutions.
Work closely with data architects, data analysts, and other team members to ensure seamless integration and operation of the data lake house.
Stay current with industry trends, emerging technologies, and best practices in data engineering and analytics.
Troubleshoot and resolve data-related issues, ensuring high levels of system availability and reliability.
Document data workflows, processes, and technical solutions for reference and knowledge sharing.
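To illustrate one common ELT pattern for a Delta-based lakehouse like the one described above, here is a minimal sketch of an idempotent MERGE load; the storage paths, join key, and table layout are hypothetical placeholders.

```python
# Minimal sketch of an idempotent ELT load into a Delta table using MERGE,
# one common pattern for keeping lakehouse tables consistent as new customer
# data arrives. Paths, keys, and column names are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdp-elt").getOrCreate()

updates = spark.read.parquet("/mnt/landing/customer_updates/")  # assumed staging area

target = DeltaTable.forPath(spark, "/mnt/curated/customers")

(
    target.alias("tgt")
    .merge(updates.alias("src"), "tgt.customer_id = src.customer_id")
    .whenMatchedUpdateAll()      # refresh existing customer rows
    .whenNotMatchedInsertAll()   # add customers seen for the first time
    .execute()
)
```

Because MERGE behaves as an upsert, re-running the job on the same batch does not create duplicate rows, which simplifies recovery when a pipeline run fails partway through.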
Qualifications
Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Minimum of 3 years of experience in data engineering, data architecture, or related roles.
Proven experience with Databricks, Azure Data Lake, Azure Synapse, and other relevant Azure services.
Strong understanding of data warehousing, ETL/ELT processes, and data modeling.
Experience with big data technologies such as Apache Spark, Delta Lake, and related frameworks.
Excellent analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues.
Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical stakeholders.
Knowledge of the electric utility industry is a plus.
Preferred Skills
Certification in data management or analytics (e.g., CDMP, CBIP).
Experience with machine learning and advanced analytics techniques.
Behavioral Attributes
Opportunistic Drive - Committed to delivering technology solutions that help Southern Company achieve their business imperatives and driven to identify opportunities to do so within the solutions, information and data we're stewards of.
Positive Can-Do Attitude - Must be willing to take full responsibility for duties and work effectively under the pressure of deadlines and shifting priorities.
Self-Starter - Able to work in a professional environment with limited direct supervision.
Results-Oriented - Acts with speed and decisiveness; takes initiative and does what it takes to meet commitments.
Safety Focused - Accepts responsibility for the safety of yourself and co-workers.
Commitment to continuous learning and improvement - Stays abreast of new technologies and techniques in the market; looks for opportunities to improve through strategy and innovation.
This job description does not, nor is it intended to represent an exhaustive listing of all duties, tasks or responsibilities for the position listed. Based upon individual experience and team workload, responsibilities are assigned in varying degrees.
Data Engineer
Data Scientist Job 15 miles from Pelham
We are looking for a highly skilled Data Engineer to join our team. The ideal candidate will have a strong background in data management, SQL, Python, and data visualization tools like Power BI. With 5+ years of experience working with data in an enterprise environment, you will design, develop, and maintain scalable data pipelines and systems. This role involves collaborating with cross-functional teams to understand data requirements and deliver effective solutions.
Qualifications:
Education: Bachelor’s degree in Computer Science, Information Technology, or a related field.
Experience:
5+ years of experience working with data in an enterprise environment.
5+ years of hands-on experience with SQL queries and Python.
Proven experience with Power BI and Oracle databases.
Strong understanding of data modeling, ETL processes, and data warehousing.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills, with the ability to work independently and as part of a team.
Key Responsibilities:
Design & Develop Scalable Data Pipelines:
Develop and maintain efficient, scalable data pipelines and systems to handle large datasets.
Collaborate with Cross-functional Teams:
Work closely with teams across the organization to understand data requirements and provide tailored solutions.
SQL Query Optimization:
Optimize and troubleshoot SQL queries for performance, ensuring efficiency and effectiveness.
ETL Process Development:
Design and maintain ETL processes using Python to extract, transform, and load data from various sources (a minimal sketch follows this list).
Data Integration & Management:
Integrate and manage data from multiple sources, including Oracle databases, to ensure seamless data flow.
Report & Dashboard Creation:
Create and maintain interactive reports and dashboards using Power BI to provide actionable insights.
Ensure Data Quality & Security:
Monitor data integrity, quality, and security, ensuring that all data meets the company’s standards.
Data Analysis & Support for Business Decisions:
Perform data analysis to support decision-making and provide insights that drive business strategies.
Stay Updated with Industry Trends:
Keep up-to-date with the latest industry trends, technologies, and best practices to continuously improve data solutions.
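To ground the SQL and Python ETL responsibilities above, here is a minimal, self-contained sketch that extracts with SQL, transforms with pandas, and loads a summary table for reporting; an in-memory SQLite database stands in for the Oracle source described above, and the table and column names are hypothetical.

```python
# Minimal sketch of a Python ETL step: extract with SQL, transform with
# pandas, and load the result back for reporting. SQLite stands in here for
# the Oracle source; table and column names are hypothetical placeholders.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'South', 120.0), (2, 'South', 80.0), (3, 'East', 200.0);
    """
)

# Extract: push filtering into SQL so only needed rows cross the wire.
orders = pd.read_sql("SELECT region, amount FROM orders WHERE amount > 50", conn)

# Transform: aggregate in pandas for the downstream dashboard.
summary = orders.groupby("region", as_index=False)["amount"].sum()

# Load: write the summarized table back for a reporting tool (e.g., Power BI) to read.
summary.to_sql("orders_summary", conn, if_exists="replace", index=False)
print(pd.read_sql("SELECT * FROM orders_summary", conn))
```

Pushing the filter into the SQL extract keeps row counts down before the transform step, which is usually the first lever when tuning queries like these.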
Why Join Us?
Impactful Work:
Be part of a team that values data-driven decision-making and supports innovation.
Career Growth:
Opportunities for professional development and career progression in a fast-paced, dynamic environment.
Collaborative Culture:
Work in a supportive and collaborative team atmosphere with access to cutting-edge tools and technologies.
Data Engineer/Analyst - Business Process & Innovation
Data Scientist Job 15 miles from Pelham
The Data Engineer role will provide critical support to stakeholders of supported applications and to other members of the Business Process & Innovation team. This role will be responsible for developing data management and governance processes, creating and supporting automated data pipelines and executing data cleanup for applications supported by the team. This position requires a strong blend of collaboration, technical proficiency, and problem-solving abilities to ensure data accessibility, quality and reliability.
**JOB RESPONSIBILITIES**
+ Design, develop and maintain robust and scalable data pipelines to ingest, process and store large volumes of structured and unstructured data
+ Collaborate with cross-functional teams to understand data requirements and translate them into technical specifications
+ Implement and optimize ETL (Extract, Transform, Load) processes to ensure efficient data flow and transformation
+ Ensure data quality and integrity by implementing data validation and cleansing processes
+ Monitor and troubleshoot data pipeline performance, identifying and resolving issues proactively
+ Develop and maintain data models and schemas to support analytics and reporting needs
+ Work with cloud-based and on-prem data platforms and tools to manage and process data efficiently
+ Stay up to date with emerging data engineering technologies and best practices to drive continuous improvement
+ Ensure compliance with data governance and security policies
**JOB REQUIREMENTS**
**Education Requirements:**
+ Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Advanced degrees are a plus.
**Experience Requirements:**
+ Experience in data engineering, software development, or a related field
+ Proven experience with data pipeline tools
+ Experience with SQL and database technologies
+ Experience with big data tools and technologies
+ Familiarity with cloud data platforms (AWS, Azure)
**Knowledge, Skills & Abilities:**
+ Proficiency in programming languages such as Python or Java
+ Strong understanding of data warehousing concepts and best practices
+ Experience with data modeling, architecture and integration
+ Excellent problem-solving skills and attention to detail
+ Strong communication skills
+ Ability to work with a broad team
+ Knowledge of data security and privacy best practices
+ Familiarity with machine learning and AI tools is a plus
**Behavioral Attributes:**
Model "Our Values": Safety First, Intentional Inclusion, Act with Integrity and Superior Performance
+ Passionate, demonstrable commitment to safety.
+ Values and takes personal initiative to support a diverse and inclusive workplace.
+ Champions teamwork and upholds the decisions of the team.
+ Maintains a customer-focused mindset and achieves excellence in customer satisfaction.
+ Unwavering adherence to the Southern Company Code of Ethics.
+ Demonstrates a commitment to the pursuit of continuous improvement.
+ Demonstrates a commitment to Innovation - identifies new opportunities to improve products & services then develops and implements plans to achieve results.
**About Southern Company**
Southern Company (NYSE: SO ) is a leading energy provider serving 9 million customers across the Southeast and beyond through its family of companies. Providing clean, safe, reliable and affordable energy with excellent service is our mission. The company has electric operating companies in three states, natural gas distribution companies in four states, a competitive generation company, a leading distributed energy solutions provider with national capabilities, a fiber optics network and telecommunications services. Through an industry-leading commitment to innovation, resilience and sustainability, we are taking action to meet customers' and communities' needs while advancing our goal of net-zero greenhouse gas emissions by 2050. Our uncompromising values ensure we put the needs of those we serve at the center of everything we do and are the key to our sustained success. We are transforming energy into economic, environmental and social progress for tomorrow. Our corporate culture has been recognized by a variety of organizations, earning the company awards and recognitions that reflect Our Values and dedication to service. To learn more, visit *********************** .
Southern Company invests in the well-being of its employees and their families through a comprehensive total rewards strategy that includes competitive base salary, annual incentive awards for eligible employees and health, welfare and retirement benefits designed to support physical, financial, and emotional/social well-being. This position may also be eligible for additional compensation, such as an incentive program, with the amount of any bonus/awards subject to the terms and conditions of the applicable incentive plan(s). A summary of the benefits offered for this position can be found here **************************************************** . Additional and specific details about total compensation and benefits will also be provided during the hiring process.
Southern Company is an equal opportunity employer where an applicant's qualifications are considered without regard to race, color, religion, sex, national origin, age, disability, veteran status, genetic information, sexual orientation, gender identity or expression, or any other basis prohibited by law.
Job Identification: 12693
Job Category: Engineering
Job Schedule: Full time
Company: Southern Company Services