Large Language Models: Key Concepts
2024-01-01
Tokenization example showing how text is processed
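A minimal sketch of how subword tokenization splits text into vocabulary pieces. The tiny vocabulary here is hand-made for illustration; real LLM tokenizers (e.g. byte-pair encoding) learn their vocabularies from data:

```python
# Toy greedy longest-match subword tokenizer (illustrative only;
# real tokenizers like BPE learn merges from a large corpus).
TOY_VOCAB = {"token", "ization", "un", "believ", "able"}

def tokenize(word: str) -> list[str]:
    tokens = []
    i = 0
    while i < len(word):
        # Take the longest vocabulary entry that matches at position i.
        for j in range(len(word), i, -1):
            if word[i:j] in TOY_VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character falls back to itself
            i += 1
    return tokens

print(tokenize("tokenization"))   # ['token', 'ization']
print(tokenize("unbelievable"))   # ['un', 'believ', 'able']
```

The model never sees raw text, only the IDs of pieces like these; rare words split into several tokens, common words often map to one.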
Note: Larger context = better understanding, but higher computational cost
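One reason for that cost: self-attention compares every token with every other token, so compute grows roughly quadratically with context length. A toy back-of-the-envelope sketch:

```python
# Rough illustration: self-attention scores all token pairs,
# so pairwise comparisons scale with the square of context length.
def attention_pairs(context_length: int) -> int:
    return context_length * context_length

print(attention_pairs(1_000))   # 1,000,000 pairs
print(attention_pairs(10_000))  # 100,000,000 pairs: 10x the context, ~100x the work
```

This is why doubling the context window more than doubles the compute bill.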
Key Breakthrough: “Attention is All You Need” (2017)
Vaswani et al. (2017). “Attention Is All You Need”. NeurIPS 30.
The Centaur and Cyborg approaches, based on *Co-Intelligence: Living and Working with AI* by Ethan Mollick
Image created with Claude.ai
Image created in detailed photorealistic style by Ralph Losey with ChatGPT4 Visual Muse version
Centaur approach: dividing tasks between human and AI
Planning: 👤 Design research plan → 🤖 Suggest variables
Data Prep: 👤 Define cleaning rules → 🤖 Execute cleaning code
Analysis: 👤 Choose methods → 🤖 Implement code → 👤 Validate results
Reporting: 👤 Outline findings → 🤖 Draft sections → 👤 Finalize
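The Data Prep split above can be made concrete: the human states cleaning rules in plain language, and the AI drafts code that executes them. A minimal sketch, where the column names and rules are hypothetical:

```python
# Hypothetical cleaning rules specified by the human analyst:
#   1. drop rows with a missing 'age'
#   2. clip 'age' to the plausible range 0-120
#   3. normalize 'name' to title case
def clean(rows: list[dict]) -> list[dict]:
    cleaned = []
    for row in rows:
        if row.get("age") is None:
            continue                               # rule 1
        age = min(max(row["age"], 0), 120)         # rule 2
        cleaned.append({"name": row["name"].title(), "age": age})  # rule 3
    return cleaned

raw = [{"name": "ada lovelace", "age": 36}, {"name": "unknown", "age": None}]
print(clean(raw))  # [{'name': 'Ada Lovelace', 'age': 36}]
```

The human still validates the output (the 👤 step), which is exactly the division of labor the Centaur model prescribes.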
Cyborg approach: blending human and AI work
Planning: 👤🤖 Interactive brainstorming · Collaborative refinement
Data Prep: 👤🤖 Iterative cleaning · Real-time modification · Joint discovery
Analysis: 👤🤖 Exploratory conversation · Dynamic adjustment · Continuous validation
Reporting: 👤🤖 Co-writing process · Real-time feedback · Iterative improvement
Treat each prompt like a small task you would hand to a research assistant (RA)
To get a highly relevant response, make sure your request provides all important details and context. Otherwise you are leaving it to the model to guess what you mean. 📍 more later
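As an illustration, compare a vague request with one that spells out the details an RA would need. Both prompts, the column names, and the analysis task are hypothetical:

```python
# A vague prompt forces the model to guess the dataset, method, and output format.
vague = "Analyze my data."

# A detailed prompt supplies the context the model cannot guess.
detailed = (
    "You are assisting with a survey analysis. "
    "The CSV has columns 'age', 'income', and 'vote_choice'. "
    "Run a logistic regression of vote_choice on age and income, "
    "and report coefficients with 95% confidence intervals."
)

print(len(vague), len(detailed))  # the detailed prompt carries far more usable context
```

The extra sentences cost little to write but remove most of the model's guesswork.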
Data Analysis with AI - 2024