Techniques for Effective Prompt Engineering

Apr 10, 2025

Lecture Notes: Prompt Engineering

Presented by: Lee Boonstra

Acknowledgements

  • Content Contributors: Michael Sherman, Yuan Cao, Erick Armbrust, Anant Nawalgaria, Antonio Gulli, Simone Cammel
  • Curators and Editors: Antonio Gulli, Anant Nawalgaria, Grace Mollison
  • Technical Writer: Joey Haymaker
  • Designer: Michael Lanning

Introduction to Prompt Engineering

  • Definition: A prompt is the input a large language model (LLM) uses to generate a specific output.
  • Goal: Craft effective prompts that guide LLMs toward accurate, useful outputs.
  • Challenges: Ambiguous prompts can lead to inaccurate responses.

LLM Output Configuration

  • Key Configurations:
    • Output Length: Affects computation, response time, and costs.
    • Sampling Controls: Temperature, Top-K, and Top-P.

Temperature

  • Controls randomness in token selection.
  • Low temperature = more deterministic output (temperature 0 is greedy: always the most probable token); high temperature = more random, diverse output.

Top-K and Top-P

  • Top-K: Restricts selection to the K most probable tokens.
  • Top-P (nucleus sampling): Restricts selection to the smallest set of top tokens whose cumulative probability reaches the threshold P.
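To make these three controls concrete, here is a minimal sketch of how temperature, Top-K, and Top-P act on a model's next-token distribution. The function names and the toy logit values are illustrative, not part of any real API.

```python
import math

def apply_temperature(logits, temperature):
    """Scale logits by temperature, then softmax into probabilities.
    Lower temperature sharpens the distribution (more deterministic)."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_filter(probs, k):
    """Keep only the k most probable tokens, renormalized."""
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep = set(ranked[:k])
    kept = [p if i in keep else 0.0 for i, p in enumerate(probs)]
    total = sum(kept)
    return [p / total for p in kept]

def top_p_filter(probs, p):
    """Keep the smallest set of top tokens whose cumulative
    probability reaches p (nucleus sampling), renormalized."""
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep, cum = set(), 0.0
    for i in ranked:
        keep.add(i)
        cum += probs[i]
        if cum >= p:
            break
    kept = [q if i in keep else 0.0 for i, q in enumerate(probs)]
    total = sum(kept)
    return [q / total for q in kept]
```

Note that at temperature 0.1 nearly all probability mass collapses onto the top token, while Top-K=1 and a small Top-P both reduce sampling to the single most probable token.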

Prompting Techniques

General Prompting / Zero-Shot

  • The simplest form of prompting: a task description alone, with no examples.

One-Shot & Few-Shot

  • One-Shot: Provides a single example.
  • Few-Shot: Provides multiple examples to establish a pattern.
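A few-shot prompt is just a task description followed by worked input/output pairs and the new query. The helper below is a hypothetical sketch of that assembly; the "Input:"/"Output:" labels are one common convention, not a requirement.

```python
def build_few_shot_prompt(task, examples, query):
    """Assemble a few-shot prompt: task description, worked
    input/output examples, then the new query left open."""
    lines = [task, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
        lines.append("")
    lines.append(f"Input: {query}")
    lines.append("Output:")  # the model completes from here
    return "\n".join(lines)
```

With one example this is one-shot; adding more pairs turns it into few-shot and establishes the output pattern more firmly.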

System, Contextual, and Role Prompting

  • System Prompting: Sets overall task context.
  • Contextual Prompting: Offers task-specific details.
  • Role Prompting: Assigns a character or identity to align outputs with specific roles.
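One way to combine the three, assuming a chat-style message API of the kind many LLM providers expose (the function and the travel-guide role here are illustrative assumptions):

```python
def build_messages(system, context, role, question):
    """Combine a system prompt, a role assignment, and task-specific
    context into a chat-style message list."""
    system_text = f"{system} You are acting as {role}."
    user_text = f"Context: {context}\n\nQuestion: {question}"
    return [
        {"role": "system", "content": system_text},  # overall task framing + role
        {"role": "user", "content": user_text},      # task-specific context + query
    ]
```

The system prompt sets the overall frame, the role shapes tone and perspective, and the contextual details ground the answer in the task at hand.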

Step-Back Prompting

  • Encourages broader reasoning before specific task execution.

Chain of Thought (CoT)

  • Generates intermediate reasoning steps to enhance LLM reasoning capabilities.

Self-Consistency

  • Samples multiple reasoning paths (e.g., several CoT runs at a higher temperature), then takes a majority vote over the final answers.
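The voting step itself is straightforward. This sketch assumes the final answers have already been extracted from several sampled responses:

```python
from collections import Counter

def self_consistency_answer(sampled_answers):
    """Majority-vote over final answers from multiple sampled
    CoT runs; also return the winning answer's vote share."""
    counts = Counter(sampled_answers)
    answer, votes = counts.most_common(1)[0]
    return answer, votes / len(sampled_answers)
```

The vote share gives a rough pseudo-confidence: answers that most reasoning paths converge on are more likely to be correct.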

Tree of Thoughts (ToT)

  • Explores multiple reasoning paths simultaneously for complex tasks.
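ToT can be approximated as a beam search over partial "thoughts": expand each candidate, score the results, and keep only the most promising few at each level. This is a simplified sketch, assuming caller-supplied `expand` and `score` functions (in practice both would themselves be LLM calls):

```python
def tree_of_thoughts(root, expand, score, beam_width=2, depth=2):
    """Level-by-level exploration of reasoning paths, keeping the
    best `beam_width` partial thoughts at each level (beam search)."""
    frontier = [root]
    for _ in range(depth):
        # expand every surviving thought into its children
        candidates = [child for thought in frontier for child in expand(thought)]
        if not candidates:
            break
        candidates.sort(key=score, reverse=True)  # best-scoring first
        frontier = candidates[:beam_width]        # prune to the beam
    return max(frontier, key=score)
```

Unlike plain CoT, which commits to a single chain, this keeps several partial solutions alive and prunes the weak ones as the tree deepens.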

ReAct (Reason & Act)

  • Combines reasoning and external tool interaction for problem-solving.
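The ReAct pattern alternates a reasoning step ("what should I do next?") with an action step (calling an external tool) until the model decides it can answer. The loop below is a minimal sketch in which the hypothetical `plan` function stands in for the LLM's reasoning:

```python
def react_loop(question, tools, plan, max_steps=5):
    """Minimal Reason-and-Act loop. `plan` inspects the trace so far
    and returns either ("act", tool_name, tool_input) or
    ("finish", answer); `tools` maps names to callables."""
    trace = [("question", question)]
    for _ in range(max_steps):
        decision = plan(trace)              # reason over what is known so far
        if decision[0] == "finish":
            return decision[1], trace
        _, tool_name, tool_input = decision
        observation = tools[tool_name](tool_input)  # act via an external tool
        trace.append(("action", tool_name, tool_input))
        trace.append(("observation", observation))
    return None, trace  # gave up within the step budget
```

Each observation is fed back into the trace, so later reasoning steps can build on what the tools returned.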

Automatic Prompt Engineering

  • Automates prompt generation and refinement to enhance model performance.

Code Prompting

  • Includes writing, explaining, translating, and debugging code using LLMs.

Best Practices

  • Provide Examples: Use one-shot or few-shot examples.
  • Design Simplicity: Keep prompts clear and concise.
  • Specificity: Be specific about the desired output.
  • Use Instructions Over Constraints: Tell the model what to do rather than what to avoid; reserve constraints for safety or strict format requirements.
  • Control Max Token Length: Essential for output management.
  • Use Variables: To make prompts dynamic.
  • Experiment: Test different input formats and styles.
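On the "use variables" point: keeping the prompt as a template with named placeholders makes it reusable and easy to iterate on. A minimal sketch (the template text is an illustrative example, not from the lecture):

```python
# A reusable prompt template with a named variable.
PROMPT_TEMPLATE = "You are a travel guide. Tell me a fact about {city}."

def render_prompt(template, **variables):
    """Fill named variables into a prompt template."""
    return template.format(**variables)
```

Changing the template in one place then updates every call site, which also makes the documented prompt iterations easier to track.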

Final Notes

  • Documentation: Keep detailed records of prompt iterations and outcomes.
  • Adaptation: Stay updated with model changes and adapt prompts accordingly.

Summary

  • Discussed various prompting techniques and best practices to enhance LLM interaction and output accuracy.