How have generative AI tools changed your workflow at the Federal Reserve? What is your opinion on their impact — both on your own work and on the data analytics industry more broadly?
You participated in Fed Challenge and FDIC Challenge during college. How did those experiences influence your learning, professional development, and career preparation?
What kinds of backgrounds do your colleagues have at the Federal Reserve? Do they share similar academic paths to yours, or do they come from different disciplines?
When you were job searching and interviewing, what qualities or skills did employers seem to value the most? Beyond technical ability, what stood out as important during your conversations with them?
Jaehyung (AI Enablement / Analytics at Invisible Technologies)
Works with AI model companies like OpenAI, Cohere, Google DeepMind
Manages data pipelines, automation, and analytics with 20,000+ distributed agents
Emphasized Streamlit apps, personal data projects, and building tools beyond class assignments
🧩 Roles & Workflows
Jason (Data Engineer at Momentive)
Works in IT data engineering, integrating ERP, Workday, travel, HR data into Snowflake
Builds Tableau dashboards and automated data pipelines for business decision-making
Transitioned from internship → full-time by demonstrating value through small data tools
💡 Lessons on Career Preparation
Personal Projects > Just Coursework
Build one deep project you can talk about—more powerful than many shallow demos
Internships: Ideally by junior year, but personal projects + outreach can substitute
Networking: Reaching out to alumni on LinkedIn works—alumni want to help
Data Career Reality: Messy data, stakeholder communication, iteration, and revisions matter more than “perfect code”
🤖 AI & Modern Workflow Changes
60–80% of analytics/engineering code now written with AI assistants (Claude, Copilot)
“Vibe coding” → prompting AI to generate logic flows instead of hand-coding everything
Still requires human review → AI amplifies analysts, not replaces them
Data sensitivity & responsible AI use are now core skills (Copilot within secure environment)
🧭 Mindset & Professional Growth
Don’t just execute requests — ask “why” and co-design solutions with stakeholders
Avoid taking feedback personally — debugging business logic is normal
Stay adaptable — small teams + AI enable startup-style speed, even inside large companies
✅ Skills & Tools to Prioritize
Coding Fundamentals Matter — Python, SQL, R (to understand and correct AI-generated code)
Streamlit / App Building — both alumni mentioned it as a direct advantage
Data Pipeline Understanding — Snowflake, ETL logic, API data ingestion
💡 API (Application Programming Interface): A structured way for one software system to communicate with another, letting programs request data or trigger actions automatically rather than through a manual interface (e.g., a browser).
Example: The Google Maps API lets applications access location, routing, and distance data programmatically.
Dashboarding / Visualization — Tableau or Shiny (used in internship-to-job pipeline)
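The "ETL logic" mentioned above can be sketched in a few lines of plain Python. This is a minimal illustration, not any company's actual pipeline: the CSV sample and department names are made up, and a real pipeline would extract from an API or warehouse and load the result back into a table rather than a dict.

```python
import csv
import io
from collections import defaultdict

# Extract: read raw rows from a CSV source (here an in-memory sample;
# in practice this might be an API response or a Snowflake export)
raw_csv = """department,amount
HR,1200
IT,3400
HR,800
Travel,950
IT,1600
"""
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: cast amounts to numbers and aggregate by department
totals = defaultdict(float)
for row in rows:
    totals[row["department"]] += float(row["amount"])

# Load: a real pipeline would write this to a warehouse table;
# here the final summary dict stands in for the load step
summary = dict(totals)
print(summary)  # {'HR': 2000.0, 'IT': 5000.0, 'Travel': 950.0}
```

The three comments map one-to-one onto the Extract / Transform / Load stages the notes refer to.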
🧠 Real Data Work = Beyond Clean CSVs
Messy, inconsistent corporate data is normal
Analysts must reshape, clean, validate, and question data
UAT (User Acceptance Testing) → users will question your output; don’t take it personally
Communication = Core skill
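The "messy, inconsistent corporate data" described above can be made concrete with a small sketch. The records, bank names, and formats below are hypothetical, chosen to show the usual failure modes: stray whitespace, mixed casing, two date formats, and missing values.

```python
from datetime import datetime

# Hypothetical raw records with typical inconsistencies
raw_records = [
    {"bank": "  First National ", "reported": "2024-01-31", "assets": "1500"},
    {"bank": "first national",    "reported": "01/31/2024", "assets": ""},
    {"bank": "ACME BANK",         "reported": "2024-02-29", "assets": "2300"},
]

def parse_date(text):
    """Try the formats seen in this feed; fail loudly on anything new."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(text, fmt).date()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {text!r}")

cleaned = []
for rec in raw_records:
    cleaned.append({
        "bank": rec["bank"].strip().title(),      # normalize the name
        "reported": parse_date(rec["reported"]),  # unify date formats
        "assets": float(rec["assets"]) if rec["assets"] else None,  # explicit missing
    })
```

After cleaning, the two "First National" rows share one canonical name, and the missing assets value is an explicit `None` an analyst can question and follow up on.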
💬 Communication & Stakeholder Logic
Don’t just deliver what’s asked — ask “What decision will this data support?”
Stakeholders often don’t know what they need
You become the data advisor, not just the “SQL person”
Work smart, not just hard — clarify goals before coding dashboards
🧪 Personal Projects = Career Currency
Quality > Quantity — one deep project beats five generic Kaggle plots
Kaggle — a platform for data science competitions, datasets, and collaborative notebooks.
Jaehyung used a self-tracked life project to stand out in interviews
Use class projects as foundation, then extend them your own way
🎓 Internships & Entry Points
Common internship timing: Summer before senior year
No internship? → personal projects + LinkedIn outreach still work, though an internship remains the stronger credential
Alumni emphasized: “If a Geneseo student messages me, I feel responsible to reply.”
🤖 How AI Actually Fits Into Data Jobs
80% of daily code now AI-assisted
“Vibe coding” = prompting AI to write working SQL/Python snippets
Human oversight remains essential
Company policies differ — many only allow Copilot or internal AI
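A sketch of the "vibe coding" loop described above: the analyst prompts an assistant for a query, then reviews it against known data before trusting the output. The table, rows, and query here are invented for illustration; the point is the human-review step, not the SQL itself.

```python
import sqlite3

# Hypothetical scenario: an AI assistant was prompted for
# "total revenue by region and month" and returned ai_query below.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "2024-01", 100.0),
     ("East", "2024-01", 50.0),
     ("West", "2024-01", 80.0)],
)

# AI-generated query: the human review checks that it actually
# aggregates (SUM + GROUP BY) instead of just listing raw rows
ai_query = """
SELECT region, month, SUM(revenue) AS total
FROM sales
GROUP BY region, month
ORDER BY region
"""
rows = conn.execute(ai_query).fetchall()
print(rows)  # [('East', '2024-01', 150.0), ('West', '2024-01', 80.0)]
```

Running the query against a tiny dataset whose correct answer is known by hand is one practical form of the "human oversight" the notes emphasize.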
🛡 Data Ethics & Security Awareness
Never paste confidential data into public AI tools
Use secure AI (e.g., Microsoft Copilot) within company systems
Understanding governance & privacy = part of professional data practice
🧭 Career Mindset Recalibration
Big Tech ≠ only path — small, AI-enabled startups thrive
Solo founders can build functional apps with AI + data pipelines
“Best time in history to build something—don’t just consume.”
Summary of Oliver’s Session
💼 Oliver’s Career Path
Joined the Statistics Department at the Federal Reserve Bank of New York
Department Role:
Collects and processes regulatory data from banks
Ensures data accuracy and detects anomalies
Communicates with reporting institutions to verify discrepancies
Uses Python, SQL, and Excel daily
Transitioning from Alteryx to Python workflows
💡 Note: Alteryx is a visual data analytics platform that allows users to prepare, clean, and analyze data through a drag-and-drop interface; it is often used in industry for building workflows without extensive coding.
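To make the Alteryx-to-Python transition concrete, here is a hypothetical translation of two visual-workflow tools (a Filter and a Join) into plain Python. The bank names, IDs, and threshold are invented for the example.

```python
# Made-up input data standing in for two workflow inputs
reports = [
    {"bank_id": 1, "assets": 5000},
    {"bank_id": 2, "assets": 120},
    {"bank_id": 3, "assets": 780},
]
banks = {1: "Alpha Trust", 2: "Beta Savings", 3: "Gamma Bank"}

# Filter tool: keep banks reporting assets at or above a threshold
large = [r for r in reports if r["assets"] >= 500]

# Join tool: attach each bank's name from the reference table
joined = [{**r, "name": banks[r["bank_id"]]} for r in large]
print(joined)
# [{'bank_id': 1, 'assets': 5000, 'name': 'Alpha Trust'},
#  {'bank_id': 3, 'assets': 780, 'name': 'Gamma Bank'}]
```

Each drag-and-drop tool typically maps to one short, reviewable block of code, which is part of why teams migrate visual workflows to scripts.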
🧠 Daily Responsibilities
Maintain and analyze regulatory datasets (hundreds of banks, monthly reports)
Perform data transformation, cleaning, and aggregation
Identify anomalies and follow up with banks for clarification
Develop efficient data pipelines and quality assurance checks
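A minimal sketch of the anomaly-detection step described above, with invented numbers: flag any month-over-month change in a bank's reported figure beyond a threshold, producing a list of months to confirm with the reporting institution. The 25% threshold is an arbitrary illustration, not the Fed's actual rule.

```python
# Hypothetical monthly reported figures for one bank
monthly = {
    "2024-01": 1000.0,
    "2024-02": 1020.0,
    "2024-03": 1010.0,
    "2024-04": 1600.0,  # suspicious jump worth a follow-up call
    "2024-05": 1590.0,
}

THRESHOLD = 0.25  # flag month-over-month changes larger than 25%

months = sorted(monthly)
anomalies = []
for prev, cur in zip(months, months[1:]):
    change = abs(monthly[cur] - monthly[prev]) / monthly[prev]
    if change > THRESHOLD:
        anomalies.append((cur, round(change, 3)))

print(anomalies)  # [('2024-04', 0.584)]
```

The flagged month becomes the basis for the follow-up question to the bank, matching the "identify anomalies and follow up for clarification" responsibility above.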
🤖 Generative AI in the Workplace
The Fed prohibits public AI tools (e.g., ChatGPT) due to data sensitivity
Uses a secure in-house generative AI system
Oliver’s perspective:
AI is excellent for routine transformations and debugging
Analysts still need domain understanding and context awareness
“You need to know what you’re asking it to do—and what the right answer looks like.”
🧮 Choosing Majors and Minors
Initially unsure of major; inspired by a pop economics book (e.g., Freakonomics)
Chose Economics for its analytical and real-world reasoning value
Added Mathematics and Data Analytics:
Enjoyed quantitative reasoning and problem-solving
The Data Analytics minor was newly introduced (2022)