What is the Dartmouth Workshop in AI?

by Stephen M. Walker II, Co-Founder / CEO

What was the Dartmouth Workshop?

The Dartmouth Workshop, officially known as the Dartmouth Summer Research Project on Artificial Intelligence, was a seminal event in the history of artificial intelligence (AI). It took place in 1956 at Dartmouth College in Hanover, New Hampshire, and is widely considered the founding event of AI as a distinct field of study.

The workshop was organized by John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon. The project lasted approximately six to eight weeks and was essentially an extended brainstorming session. The organizers based the workshop on the conjecture that "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it".

The workshop brought together a diverse group of participants from various backgrounds, including mathematics, psychology, and electrical engineering. The attendees shared a common belief that the act of thinking is not unique to humans or even to biological entities. The discussions during the workshop were wide-ranging and helped seed ideas for future AI research.

Despite the optimism and high expectations, the workshop did not produce immediate breakthroughs toward human-level AI. It did, however, inspire many of its participants to pursue AI research in their own directions. The term "artificial intelligence" itself was coined by John McCarthy in the 1955 proposal for the workshop, and the event played a significant role in establishing AI as a legitimate field of research.

More terms

What is a knowledge-based system?

A knowledge-based system (KBS) is a form of artificial intelligence (AI) that uses a knowledge base to solve complex problems. It's designed to capture the knowledge of human experts to support decision-making. The system is composed of two main components: a knowledge base, which stores facts and rules, and an inference engine, which applies them to derive new conclusions.
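
To make the split between the two components concrete, here is a minimal sketch of a forward-chaining knowledge-based system in Python. The facts and rules (flu symptoms and recommendations) are purely hypothetical, and real systems use far richer representations and inference strategies.

```python
# Minimal sketch of a knowledge-based system: a knowledge base of facts and
# if-then rules, plus a forward-chaining inference engine that applies rules
# until no new facts can be derived. The facts and rules are hypothetical.

facts = {"has_fever", "has_cough"}

# Each rule: if all antecedent facts hold, conclude the consequent fact.
rules = [
    ({"has_fever", "has_cough"}, "possible_flu"),
    ({"possible_flu"}, "recommend_rest"),
]

def forward_chain(facts, rules):
    """Repeatedly apply rules whose conditions are satisfied by known facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
# {'has_fever', 'has_cough', 'possible_flu', 'recommend_rest'}
```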


What is backpropagation through time (BPTT)?

Backpropagation through time (BPTT) is a method for training recurrent neural networks (RNNs), which process sequences of data by maintaining a 'memory' of previous inputs through internal states. BPTT extends the backpropagation used in feedforward networks to RNNs by unrolling the network across time steps and propagating gradients backward through the sequence, so that the shared weights receive gradient contributions from every step.
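
As a rough illustration, the sketch below unrolls a toy vanilla RNN in NumPy and backpropagates through each time step. The shapes, the random data, and the choice of a squared-error loss on only the final hidden state are assumptions made for brevity.

```python
import numpy as np

# Toy vanilla RNN: h_t = tanh(Wx @ x_t + Wh @ h_{t-1}); loss is squared error
# on the final hidden state. Sizes and targets are made up for illustration.
rng = np.random.default_rng(0)
T, n_in, n_h = 4, 3, 5                  # sequence length, input size, hidden size
xs = rng.normal(size=(T, n_in))         # input sequence
target = rng.normal(size=n_h)           # target for the final hidden state

Wx = rng.normal(scale=0.1, size=(n_h, n_in))
Wh = rng.normal(scale=0.1, size=(n_h, n_h))

# Forward pass: unroll the RNN over time, storing hidden states for reuse.
hs = [np.zeros(n_h)]
for t in range(T):
    hs.append(np.tanh(Wx @ xs[t] + Wh @ hs[-1]))
loss = 0.5 * np.sum((hs[-1] - target) ** 2)

# Backward pass (BPTT): propagate the gradient back through every time step,
# accumulating contributions to the shared weights Wx and Wh.
dWx, dWh = np.zeros_like(Wx), np.zeros_like(Wh)
dh = hs[-1] - target                    # dL/dh_T
for t in reversed(range(T)):
    dz = dh * (1.0 - hs[t + 1] ** 2)    # through tanh: h_t = tanh(z_t)
    dWx += np.outer(dz, xs[t])          # contribution at step t
    dWh += np.outer(dz, hs[t])          # uses the previous hidden state
    dh = Wh.T @ dz                      # gradient flowing back to h_{t-1}
```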

