
Prompt Engineering

A collection of prompt templates and prompt-engineering techniques.


  • Guide book: https://github.com/dair-ai/Prompt-Engineering-Guide (Korean version: https://www.promptingguide.ai/kr)
  • Basic methods
    • Zero-shot: https://www.promptingguide.ai/kr/techniques/zeroshot
    • Few-shot: https://www.promptingguide.ai/kr/techniques/fewshot
  • Chain of Thought (CoT): https://www.promptingguide.ai/kr/techniques/cot
  • Self-Consistency: https://www.promptingguide.ai/kr/techniques/consistency
  • Generate Knowledge Prompting: https://www.promptingguide.ai/kr/techniques/knowledge
  • Prompt Chaining: https://www.promptingguide.ai/kr/techniques/prompt_chaining
  • Tree of Thought (ToT): https://www.promptingguide.ai/kr/techniques/tot
  • Automatic Reasoning and Tool Use (ART): https://www.promptingguide.ai/kr/techniques/art
  • Automatic Prompt Engineer (APE): https://www.promptingguide.ai/kr/techniques/ape
  • Active Prompt: https://www.promptingguide.ai/kr/techniques/activeprompt
  • Directional Stimulus Prompting: https://www.promptingguide.ai/kr/techniques/dsp
  • Program-Aided Language Model (PAL): https://www.promptingguide.ai/kr/techniques/pal
  • ReAct: https://www.promptingguide.ai/kr/techniques/react
  • Reflexion: https://www.promptingguide.ai/kr/techniques/reflexion
  • Multimodal CoT: https://www.promptingguide.ai/kr/techniques/multimodalcot
  • Adversarial Prompting: https://www.promptingguide.ai/kr/risks/adversarial
  • Factuality Prompting: https://www.promptingguide.ai/kr/risks/factuality
  • Reference
    • Claude Prompt Library (Anthropic): https://docs.anthropic.com/claude/prompt-library
    • Prompt Layer: https://docs.promptlayer.com/introduction
    • Awesome ChatGPT Prompts: https://github.com/f/awesome-chatgpt-prompts
    • Prompt Flow: https://github.com/microsoft/promptflow
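The basic methods in the list above differ mainly in how the prompt string is assembled and how completions are aggregated. A minimal sketch of zero-shot, few-shot, CoT, and Self-Consistency; the function names, template wording, and sample data are illustrative assumptions, not taken from the linked guides:

```python
from collections import Counter

def zero_shot(task: str, text: str) -> str:
    # Zero-shot: state the task and the query directly, with no examples.
    return f"{task}\nText: {text}\nLabel:"

def few_shot(task: str, examples: list[tuple[str, str]], text: str) -> str:
    # Few-shot: prepend labeled demonstrations before the actual query.
    demos = "\n".join(f"Text: {t}\nLabel: {l}" for t, l in examples)
    return f"{task}\n{demos}\nText: {text}\nLabel:"

def chain_of_thought(task: str, text: str) -> str:
    # Zero-shot CoT: elicit intermediate reasoning before the final answer.
    return f"{task}\nText: {text}\nLet's think step by step."

def self_consistency(sampled_answers: list[str]) -> str:
    # Self-Consistency: sample several CoT completions (temperature > 0),
    # parse out each final answer, then take a majority vote.
    return Counter(sampled_answers).most_common(1)[0][0]

examples = [("Loved every minute of it", "positive"),
            ("Terribly boring", "negative")]
print(few_shot("Classify the sentiment.", examples, "Would watch it again"))
print(self_consistency(["positive", "positive", "negative"]))  # majority wins
```

The other entries in the list (ReAct, PAL, ART, and so on) build on the same idea, adding tool calls or generated code between the prompt and the final answer.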