Friday, January 2, 2026

 

10-27-2025 Doyle’s Tech Talk Lecture: Evolution of Computing, NLP Fundamentals, and Practical AI Use Cases


Date & Time: 2025-10-27 01:30 PM
Location: Brookdale Skyline Auditorium (SKR)
Lecture: Evolution of Computing, NLP Fundamentals, and Practical AI Use Cases

NLP and statistics · Prompt engineering · Python ecosystem

Theme

A veteran engineer traces computing from mainframes and Fortran through VBA and Python to today’s generative AI, emphasizing his personal journey and exposure to computers and languages. The talk explains Natural Language Processing (NLP) as large-scale statistical modeling, why neural networks demand heavy compute power, and how natural language prompts guide AI outputs. It covers practical tools for transcription and summarization, cautions on AI reliability, and plans for hands-on demos across various generative AI platforms like ChatGPT, Copilot, Gemini, and Claude. The session ties historical context from the speaker's nearly 40-year career to modern AI practices and resources, highlighting the ability to communicate with computers through spoken language.

Takeaways

  1. Lecturer’s background: Considers Pueblo, Colorado his hometown; graduated from South High School as a member of its first graduating class in 1960; attended Pueblo Junior College for two years, then transferred to CU Boulder, earning a degree in electrical engineering with an electronics emphasis in two years. His single computer class came in the spring semester of his senior year, on a vacuum-tube IBM 709, before a computer science department even existed.
  2. Early computing exposure: Observed the IBM 709 behind glass doors with its big bank of lights flickering during processing; the comprehensive course introduced the structure of computers and the idea of computer programming, covering assembly language (the actual instructions a computer executes) and higher-order languages like an early version of Fortran 2 and MAD (Michigan Algorithm Decoder), an early interpreter language. The course also covered the differences between interpreters, compilers, and assemblers, which relate to the level of interaction with the computer.
  3. Career path: Based on this single computer course, he secured his first job with Control Data Corporation in Minneapolis, Minnesota. He lasted about a year and a half, finding it "wasn't quite what I expected for a career," partly due to severe winters (below zero for a week straight, little sun). He moved back to Colorado, working with defense department contractors on early systems for Cheyenne Mountain space surveillance and satellite tracking. He wrote code to maintain the satellite catalog, a career he pursued for nearly 40 years.
  4. Programming languages over time: Exposed to many evolving computer languages, including Fortran 2 and Fortran 4, writing a lot of Fortran code. Fortran evolved significantly over his career, incorporating capabilities from other languages like pointers and structures; he believes Fortran is still around, with the latest version named "Fortran 2023." He also encountered a series of interpreter languages beyond MAD, such as ALGOL, SNOBOL (pronounced "snowball"), and COBOL (Common Business Oriented Language), noting he studied COBOL but never used it for a job. More sophisticated quasi-assembly languages like C and C++ emerged, described as more object-oriented and having less rigid syntax.
  5. Visual Basic for Applications (VBA) in Excel: Worked with Visual Basic, the language "underneath our modern spreadsheet Excel," for writing macros. He developed a complex Excel-based system of macros that would "scrape Fidelity's website" to pull information for stock analysis, applying statistical math and interfacing with other components. This system became unusable when Fidelity changed their website, illustrating the dynamic and varied evolution of programming concepts and tools.
  6. Core thesis on modern AI: The advent of generative text AIs like ChatGPT, Copilot, Gemini, and Claude 4 has created the ability to communicate with computers through spoken language. Prompts are the interface for these AIs, and the more specific a prompt, the more specific the answer will be.
  7. Two initial questions about ChatGPT: When first interested in ChatGPT, he had two fundamental questions: (1) How does it understand and manipulate language? (2) Why does it consume so much compute power?
  8. NLP understanding: Natural Language Processing (NLP) is the computer science field that digitizes language into a form computers can manipulate. He drew an analogy to satellite orbit determination: classical Keplerian orbits can be represented with six basic, purely analytical parameters, extended to nine with additions for atmospheric drag, solar radiation pressure, and three-dimensional thrusts. In contrast, NLP uses vastly more parameters, on the order of billions (he cited 2.67 billion), to represent and manipulate language. He emphasizes that AI, at its heart, is "all statistics."
  9. Statistical methods: Satellite orbit determination used statistical processes like least-squares differential correction and probability to fit orbits to observations and make predictions. Similarly, NLP employs statistical approaches, but with a far greater number of parameters. AI is framed as sophisticated statistical manipulation: not a single "do AI" instruction, but rather "very complicated, evolved, sophisticated statistics."
  10. Python ecosystem: Python has evolved as the language to support AI, despite the speaker not knowing "why it’s called Python." Its basic language concept is simple, but its capabilities are extended through libraries (e.g., for math, since most math functions beyond basic arithmetic are not built into the core language). Users can import or fabricate their own libraries, making its capabilities "literally unlimited." Python's interpretive nature allows a program to modify itself while it is being executed.

Highlights

"At the heart of it, it’s just statistics — very complicated, evolved, sophisticated statistics — but still statistics."

Chapters & Topics

Historical evolution of programming exposure and languages

The lecturer’s journey from early mainframe-era computing through decades of professional programming, illustrating how languages and paradigms evolved from assembly and early Fortran to modern high-level and object-oriented languages, and into end-user programming via VBA.

·        Keypoints

o   Vacuum-tube IBM 709 used during a senior-year course, before a CS department existed.

o   Assembly language as direct machine instruction interface contrasted with higher-order languages (Fortran 2, MAD).

o   Interpreter vs compiler vs assembler distinctions based on interaction level with the machine.

o   Fortran evolution (Fortran 2, Fortran 4) and adoption of capabilities like pointers and structures; assertion that modern-day Fortran persists as “Fortran 2023.”

o   Exposure to ALGOL, SNOBOL (pronounced “snowball”), COBOL (Common Business Oriented Language, studied but not used professionally), C, and C++ (object-oriented, less rigid syntax).

o   VBA enabling Excel macros and full program-like systems.

·        Explanation
The lecturer details a chronological path: an early, comprehensive course introduced assembly and high-level languages, setting up employability at Control Data. Over his nearly 40-year career, Fortran and other languages were used in defense and satellite tracking contexts. Interpreter languages (MAD, ALGOL, SNOBOL) differ in runtime behavior from compiled languages (Fortran, C). C and C++ introduced object orientation and less rigid syntax demands. In end-user computing, VBA underpins Excel macros, enabling tooling that can be sophisticated enough to scrape websites and perform statistical analysis, though such systems are brittle to external changes (e.g., site redesigns).

·        Examples

        Upon moving to Skyline, the lecturer built a set of Visual Basic programs in Excel to scrape Fidelity’s website for stock analysis. The system applied statistical math and interfaced with other components, growing complex over time. When Fidelity changed their website, scraping stopped working and the project became unusable.

o   Identify data source (Fidelity pages) and target metrics for stock analysis.

o   Prototype VBA macros to navigate and extract HTML content.

o   Iteratively add statistical routines and integrations as complexity grows.

o   Monitor for site changes; when DOM structure/endpoints changed, scraper failed.

o   Recognize the brittleness of scraping and contemplate API-based alternatives (a minimal sketch of the pattern follows below).
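The lecturer’s system was written in VBA inside Excel and is not reproduced here. As a hedged illustration only, the following minimal Python sketch shows the same scrape-and-parse pattern, assuming the third-party requests and beautifulsoup4 packages; the URL and the HTML element it looks for are hypothetical placeholders. The point it makes is the brittleness: the parsing step depends on the page’s current HTML layout, so a site redesign breaks the tool.

# Minimal sketch of the scrape-and-parse pattern (illustrative only, not the lecturer's VBA system).
# Assumes the third-party "requests" and "beautifulsoup4" packages; the URL and CSS class are hypothetical.
import requests
from bs4 import BeautifulSoup

def fetch_last_price(symbol: str) -> float:
    url = f"https://example.com/quote/{symbol}"    # hypothetical quote page
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("span", class_="last-price")   # brittle: tied to the site's current layout
    if tag is None:
        raise RuntimeError("Page layout changed; the scraper needs updating.")
    return float(tag.text.replace("$", "").replace(",", ""))

# Example use: fetch_last_price("ABC") returns the scraped price only until the markup changes.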

Natural Language Processing (NLP) and parameterization

NLP transforms human language into machine-manipulable representations and uses large statistical models with billions of parameters to predict and generate text.

·        Keypoints

o   NLP is the field that digitizes language for computation.

o   Satellite orbit analogy: 6 classical Keplerian parameters (purely analytical) plus 3 extensions for drag, solar radiation pressure, and unknown thrusts (total 9) versus NLP with 2.67 billion parameters.

o   Statistical fitting in orbits uses least-squares differential correction and probabilistic predictions; NLP analogously relies on statistics at far greater scale.

·        Explanation
By comparing orbit determination to NLP, the lecturer emphasizes scale. Orbit models require nine parameters to fit observations; NLP models such as large language models use on the order of billions of parameters (e.g., 2.67 billion) to capture nuances of grammar, meaning, and context. This vast parameterization underlies the model’s ability to ‘understand’ and generate language, but remains fundamentally statistical.
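As a hedged illustration of the statistics-at-different-scales point, the short NumPy sketch below fits just three parameters to noisy observations by least squares, the same kind of fitting the orbit-determination analogy describes; a large language model applies the same underlying idea with billions of parameters instead of a handful. The polynomial model and numbers are invented for the example.

# Minimal sketch: least-squares fitting of a few parameters to noisy observations,
# analogous in spirit to differential-correction orbit fitting (toy polynomial model,
# invented numbers). LLMs do statistical fitting of the same flavor with billions of parameters.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)                  # observation times
true_params = np.array([2.0, -0.5, 0.03])       # "true" model: a + b*t + c*t**2
obs = (true_params[0] + true_params[1] * t + true_params[2] * t**2
       + rng.normal(scale=0.1, size=t.size))    # noisy observations

A = np.column_stack([np.ones_like(t), t, t**2])  # design matrix: one column per parameter
fitted, *_ = np.linalg.lstsq(A, obs, rcond=None)

print("fitted parameters:", fitted)              # close to [2.0, -0.5, 0.03]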

Neural networks and compute power

Neural networks consist of interconnected nodes enabling parallel computation; modern systems distribute work across thousands of nodes/CPUs, with physical interconnects becoming a limiting factor.

·        Keypoints

o   The neural-network metaphor compares the model to the human brain, with its many billions of neurons.

o   Parallelization contrasts with sequential step-by-step processing.

o   Thousands of nodes are deployed; individual CPUs or computers can be assigned to nodes for distributed computation.

o   Inter-node distance and communication become bottlenecks as scale increases, limiting projected compute power.

·        Explanation
The lecturer answers why generative AI consumes substantial power: training/inference leverage parallel architectures (e.g., clusters of CPUs/GPUs/TPUs). Data flows between nodes, and synchronization/communication overheads introduce physical constraints. The ability to parallelize makes large models fast enough to be practical, but at high energy cost, with the distance between nodes being a key limiting factor.
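To make the parallelism point concrete, here is a minimal sketch (assuming only NumPy, with invented sizes) that computes one toy neural-network layer two ways: node by node in explicit Python loops, and as a single vectorized matrix multiply. The vectorized form is the kind of work that spreads naturally across many cores, GPUs, or nodes, which is where both the speed and the power consumption come from.

# Minimal sketch: a single neural-network layer computed sequentially vs. vectorized.
# The vectorized matrix multiply is the kind of work that parallel hardware accelerates.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=512)              # input activations
W = rng.normal(size=(512, 512))       # layer weights
b = rng.normal(size=512)              # biases

# Sequential, step-by-step version: one output node at a time.
y_loop = np.empty(512)
for i in range(512):
    total = b[i]
    for j in range(512):
        total += W[i, j] * x[j]
    y_loop[i] = max(total, 0.0)       # ReLU activation

# Vectorized version: one matrix multiply, easily spread across cores or nodes.
y_vec = np.maximum(W @ x + b, 0.0)

print("results agree:", np.allclose(y_loop, y_vec))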

Prompt engineering basics for generative AI

Interacting with generative AIs via natural-language prompts; specificity in instructions yields more precise outputs; author styles can be emulated within bounds.

·        Keypoints

o   Prompts are the primary interface; no code is required for the end user.

o   More specific prompts lead to more specific answers.

o   Author-style emulation examples: love letters in the styles of Jack Kerouac, F. Scott Fitzgerald, and Ernest Hemingway; roughly 20 such prompts were tried, yielding "entertaining" results that referenced the authors' works and phrases.

o   A resident suggested the style of Emily Dickinson, the famous poet, which yielded poetry-like output.

o   Different platforms: ChatGPT (OpenAI), Copilot (Microsoft, via Edge), Gemini (Google), Claude 4 (Anthropic); marketing terms like “deep reasoning” are mentioned.

·        Explanation
The lecturer demonstrates by composing prompts that specify task (write a love letter) and style (named authors). The systems output text that reflects stylistic cues learned from training data. Platform selection varies by device/browser; the principle remains that clarity and constraints in prompts guide model behavior.
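No code is needed to use these systems, but for the curious, here is a minimal sketch of the same specificity principle expressed through an API, assuming the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the model name is illustrative only and not something named in the talk.

# Minimal sketch of sending a vague vs. a specific prompt, assuming the OpenAI
# Python SDK ("pip install openai") and an OPENAI_API_KEY in the environment.
# The model name is an illustrative placeholder; any chat-capable model would do.
from openai import OpenAI

client = OpenAI()

vague = "Write a love letter."
specific = ("Write a three-paragraph love letter in the style of Ernest Hemingway: "
            "short declarative sentences, no flowery adjectives.")

for prompt in (vague, specific):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    print(prompt, "\n---\n", response.choices[0].message.content, "\n")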

Python and AI tooling

Python’s simplicity and extensibility through libraries make it a de facto language for AI and data tasks; its interpretive nature and importable libraries (including math) underpin flexible development.

·        Keypoints

o   Python core syntax is simple; capabilities extended via imports.

o   Math and many domain libraries are brought in as needed, since most math functions beyond basic arithmetic are not built into the core language.

o   Interpretive execution allows dynamic behavior and rapid iteration, even letting a program modify itself while it runs.

o   Library ecosystem makes capabilities seemingly unlimited.

·        Explanation
The lecturer notes that language choice (Python) is pragmatic: easy to read/write and supported by vast libraries (though specific libraries like NumPy, pandas, PyTorch, TensorFlow are not explicitly named). This allows building, training, and serving models, and scripting data workflows.
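A minimal sketch of the import point follows: arithmetic is built into core Python, but most other math arrives through libraries, whether the standard-library math module or a third-party package such as NumPy (named here only as a common example; the talk did not name specific libraries).

# Minimal sketch: core Python vs. imported libraries.
# Arithmetic is built in, but most math functions live in libraries.
print(2 + 3 * 4)              # core language: 14

import math                   # standard-library module
print(math.sqrt(2.0))         # 1.4142135623730951

import numpy as np            # third-party library, extending capability further
print(np.mean([1, 2, 3, 4]))  # 2.5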

Risk management and reliability of AI outputs

AI-generated content may be incorrect; users should be cautious, especially with legal and financial decisions; AI verbosity and variability mirror human fallibility.

·        Keypoints

o   Standard disclaimers: AI-generated information may be incorrect.

o   Avoid using AI for legal or financial decisions.

o   AI can produce long expositions (e.g., the speaker prompted it to expound on why AI can be incorrect, and it "spewed out three pages").

o   Human analogy: people also provide incorrect or variable answers depending on phrasing, suggesting "a little bit of hint [of] human characteristics in this AI."

·        Explanation
The lecturer emphasizes critical thinking: treat AI outputs as drafts or aids, not authoritative sources, especially where stakes are high. Validate important information with trusted sources, acknowledging that AI's fallibility can be compared to human inconsistency.

Applied use cases for meeting, medical, and social contexts

Practical deployments of AI agents to record, transcribe, summarize, and structure information from meetings, medical visits, and casual conversations, with notable robustness in noisy environments.

·        Keypoints

o   The lecturer adopted an AI agent called “PLAUD AI.”

o   Used to record meetings in his role as vice chair, producing text summaries and structure.

o   Used to capture and summarize medical visits and needed follow-ups.

o   Applied to insurance interactions and social breakfasts, producing summaries.

o   Demonstrates selective listening in crowded restaurants, focusing on two voices and ignoring background noise; meeting audio summarized effectively despite filler words and disfluencies, making it "sound like we were so well organized."

·        Explanation
The lecturer records audio, then leverages AI transcription and summarization to surface action items and clarity. Noise robustness suggests advanced diarization and source separation. This enhances personal productivity and recall, providing "pretty interesting" summaries even from disorganized speech.
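The PLAUD AI device’s internals are not described in the talk. As a hedged illustration of the general transcribe-then-summarize pipeline, the sketch below assumes the open-source openai-whisper package for speech-to-text and the OpenAI Python SDK for summarization; the file name and model names are placeholders.

# Minimal sketch of a transcribe-then-summarize pipeline (not the PLAUD AI product).
# Assumes "pip install openai-whisper openai" and an OPENAI_API_KEY in the environment;
# the audio file name and model names are placeholders.
import whisper
from openai import OpenAI

# 1. Speech-to-text: transcribe the recorded meeting audio.
stt_model = whisper.load_model("base")
transcript = stt_model.transcribe("meeting.wav")["text"]

# 2. Summarization: ask a chat model for a structured summary with action items.
client = OpenAI()
summary = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "Summarize this meeting transcript as bullet points "
                   "with a list of action items:\n\n" + transcript,
    }],
)
print(summary.choices[0].message.content)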

Live demo and community engagement plan

Within a one-hour Brookdale Skylines Tech Talk, the lecturer planned background coverage, AI overview, prompting demos on attendees’ iPhones, and Q&A.

·        Keypoints

o   Series: Brookdale Skylines Tech Talk.

o   Allocated time: one hour.

o   The audience was asked to try ChatGPT, Copilot, or Gemini via Edge, Google, or an app store; GPT-4 and Claude 4 were also mentioned as available options.

o   Open Q&A to conclude the talk.

·        Explanation
A participatory format encourages hands-on experience with generative AI, reinforcing prompt engineering concepts and platform familiarity, with the speaker planning to introduce attendees to using these tools on their personal devices.

Music interlude and personal creativity

The lecturer’s piano hobby illustrates lifelong learning and creative practice, aiming to perform to conclude the talk.

·        Keypoints

o   His hobby is "playing the piano, or as I say, practice the piano."

o   Practices scales and improvisation in scales.

o   Planned to play "a few choruses" of his favorite tune, “St. Louis Blues,” followed by a classical piece practiced extensively (left hand, right hand, then combined).

o   Expressed curiosity about whether AI could record/interpret piano notes.

·        Explanation
The performance serves both as a personal touch and a metaphor for iterative skill-building—akin to refining prompts or models through practice—and a way to "summarize and wrap up this meeting."

Reading list for understanding and building with AI

Suggested materials range from conceptual understanding to practical development.

·        Keypoints

o   “What Is ChatGPT Doing ... and Why Does It Work?” by Stephen Wolfram, who created the Mathematica language and founded his own company, Wolfram Research.

o   “AI Made Simple,” a bookazine surveying products (with limited detail on installation and cost), available at www.magazinesdirect.com ($26 for a hard copy).

o   “ChatGPT: The API Bible,” for constructing AI agents (deeply programming-oriented, for those who know how to program).

·        Explanation
The resources target different depths: conceptual mechanics, user-facing overviews, and developer-level implementation. Selection depends on goals and technical comfort, offering pathways for both casual users and those interested in deep dives.
