
Concurrency Models and Patterns Across Languages Cheat Sheet

Updated 2026-05-16

Concurrency models define how programs handle multiple tasks simultaneously, whether through parallel execution on multiple cores or interleaved execution on a single core. Different programming languages adopt fundamentally different approaches — message passing (Go, Erlang), shared memory (Java, C++), async/await (JavaScript, Python), and ownership-based (Rust) — each optimized for different classes of problems. The landscape spans from low-level atomic operations and memory ordering to high-level abstractions like actors and CSP channels. Understanding these models is essential because the choice shapes performance, safety, and complexity: a lock-based approach may deadlock where a message-passing design cannot, and an async runtime may outperform threads for I/O-bound work while underperforming for CPU-bound tasks.

What This Cheat Sheet Covers

This topic comprises 12 focused tables and 89 indexed concepts. The complete table-by-table outline below runs from foundational concepts through advanced details.

Table 1: Fundamental Concurrency Models
Table 2: Language-Specific Threading Models
Table 3: Synchronization Primitives
Table 4: Atomic Operations and Memory Ordering
Table 5: Lock-Free and Wait-Free Programming
Table 6: Concurrency Patterns
Table 7: Deadlock and Livelock
Table 8: Async Runtimes and Event Loops
Table 9: Concurrency Bugs and Testing
Table 10: Context Switching and Performance
Table 11: Message Passing vs Shared Memory
Table 12: Concurrency Best Practices

Table 1: Fundamental Concurrency Models

Model | Example | Description
Threads and processes
Thread t = new Thread(() -> doWork());
t.start();
• OS-managed units of execution
• threads share memory, processes are isolated
• threads have lower overhead but require synchronization
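A minimal runnable sketch of the shared-memory point above, in Python rather than the Java shown (an illustrative assumption, not part of the original table): two threads mutate one counter, and a Lock is what makes the shared increments safe.

```python
# Two threads incrementing a shared counter; the Lock guards the memory
# that threads implicitly share (hypothetical sketch).
import threading

counter = 0
lock = threading.Lock()

def work(n):
    global counter
    for _ in range(n):
        with lock:  # without this, increments from both threads can be lost
            counter += 1

threads = [threading.Thread(target=work, args=(10_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # → 20000 only because each increment is lock-protected
```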
Actor model
case class Message(data: String)
actor ! Message("hello")
• Isolated actors communicate via asynchronous messages
• each actor processes messages sequentially
• popular in Erlang/Elixir/Akka
• avoids shared state
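The actor entry can be sketched in Python with a thread owning a queue.Queue as its mailbox (a hand-rolled illustration, not Akka or Erlang): state stays private to the actor, and messages are handled strictly one at a time.

```python
# Minimal actor sketch: one thread owns a mailbox and its own state;
# outside code only sends messages, never touches the state directly.
import queue
import threading

class CounterActor:
    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0  # private: only the actor's own thread touches it
        self._thread = threading.Thread(target=self._run)
        self._thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()  # messages processed one at a time
            if msg == "stop":
                return
            self.count += msg  # sequential processing, so no lock needed

    def send(self, msg):
        self.mailbox.put(msg)  # asynchronous: caller does not wait

actor = CounterActor()
for i in range(5):
    actor.send(i)
actor.send("stop")
actor._thread.join()
print(actor.count)  # → 10 (0 + 1 + 2 + 3 + 4)
```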
CSP (Communicating Sequential Processes)
ch := make(chan int)
ch <- 42
• Channels for message passing between goroutines
• anonymous communication (vs. actor identities)
• core abstraction in Go and Clojure core.async
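Go's channels have no exact Python equivalent, but a queue.Queue shared by two threads sketches the same CSP idea (an approximation chosen for illustration: Queue buffers one item here, whereas make(chan int) is unbuffered). Neither side knows the other's identity, only the channel.

```python
# CSP-style channel sketch: queue.Queue plays the role of Go's chan.
import queue
import threading

ch = queue.Queue(maxsize=1)  # small buffer; sends block when it is full

def producer():
    for n in [1, 2, 3]:
        ch.put(n)
    ch.put(None)  # sentinel stands in for closing the channel

received = []

def consumer():
    while (n := ch.get()) is not None:
        received.append(n)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(received)  # → [1, 2, 3]
```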
Async/await and coroutines
async def fetch():
    await asyncio.sleep(1)
• Cooperative multitasking using async functions that yield at await points
• single-threaded event loop schedules coroutines
• ideal for I/O-bound workloads
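Expanding the async/await snippet into a runnable asyncio sketch: two simulated I/O waits run on a single thread, and because each `await` yields to the event loop, the waits overlap instead of running back to back.

```python
# Cooperative multitasking: one thread, one event loop, two coroutines
# whose sleeps overlap because each await is a yield point.
import asyncio
import time

async def fetch(name, delay):
    await asyncio.sleep(delay)  # stand-in for an I/O wait
    return name

async def main():
    start = time.monotonic()
    results = await asyncio.gather(fetch("a", 0.1), fetch("b", 0.1))
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)  # → ['a', 'b']
# elapsed is roughly 0.1 s, not the 0.2 s a serial run would take
```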
Green threads / fibers
Fiber.new { sleep 1 }.resume
• User-space threads scheduled by runtime (not OS)
• lower context-switch overhead
• Ruby fibers, Go goroutines, and Java virtual threads are examples
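Fibers can be hand-rolled in Python with generators (a toy user-space scheduler, shown as an illustrative assumption rather than how any runtime actually works): each `yield` is an explicit suspension point, and a round-robin loop does the scheduling with no OS threads involved.

```python
# Generators as cooperative "fibers" driven by a tiny round-robin
# scheduler running entirely in user space.
def fiber(name, steps, log):
    for i in range(steps):
        log.append(f"{name}{i}")
        yield  # suspend; the scheduler decides who runs next

log = []
ready = [fiber("a", 2, log), fiber("b", 2, log)]
while ready:
    f = ready.pop(0)
    try:
        next(f)          # resume the fiber until its next yield
        ready.append(f)  # still alive: requeue at the back
    except StopIteration:
        pass             # fiber finished; drop it

print(log)  # → ['a0', 'b0', 'a1', 'b1']
```

The interleaved output shows the round-robin handoff: control switches only at yield points, which is exactly the cooperative property described above.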

More in Programming Languages

  • Clojure Programming Language Cheat Sheet
  • Functional Programming Cheat Sheet
  • Arrays & Strings Cheat Sheet
  • Julia Programming Language Cheat Sheet
  • Python Libraries Cheat Sheet
  • TOML Configuration Format Cheat Sheet