How to Think About Algorithms

Author: Jeff Edmonds

Publisher: Cambridge University Press

Publication date: 2008-05-19

Rating: 8.7

ISBN: 9780521614108

Category: Industry favorites

Book Description

Synopsis

There are many algorithm texts that provide lots of well-polished code and proofs of correctness. Instead, this one presents insights, notations, and analogies to help the novice describe and think about algorithms like an expert. It is a bit like a carpenter studying hammers instead of houses. Jeff Edmonds provides both the big picture and easy step-by-step methods for developing algorithms, while avoiding the common pitfalls. Paradigms such as loop invariants and recursion help to unify a huge range of algorithms into a few meta-algorithms. Part of the goal is to teach students to think abstractly. Without getting bogged down in formal proofs, the book fosters deeper understanding so that how and why each algorithm works is transparent. These insights are presented in a slow and clear manner accessible to second- or third-year students of computer science, preparing them to find on their own innovative ways to solve problems.
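
A quick illustration of the loop-invariant paradigm the blurb mentions: the sketch below is binary search written in Java (the language named in the blurb), maintaining the invariant "if the key is in the array, it lies in a[lo..hi]" while the shrinking width hi - lo serves as the measure of progress. This is not code from the book; the class and method names are invented for illustration.

    public class BinarySearchInvariant {
        // Returns an index of key in the sorted array a, or -1 if key is absent.
        static int search(int[] a, int key) {
            int lo = 0, hi = a.length - 1;
            // Loop invariant: if key occurs anywhere in a, it occurs in a[lo..hi].
            // Measure of progress: the width hi - lo strictly shrinks each iteration.
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;   // midpoint, written to avoid overflow
                if (a[mid] < key) {
                    lo = mid + 1;               // key, if present, lies right of mid
                } else if (a[mid] > key) {
                    hi = mid - 1;               // key, if present, lies left of mid
                } else {
                    return mid;                 // invariant plus equality: found it
                }
            }
            return -1;                          // invariant plus empty range: absent
        }

        public static void main(String[] args) {
            int[] sorted = {2, 3, 5, 7, 11, 13};
            System.out.println(search(sorted, 7));   // prints 3
            System.out.println(search(sorted, 4));   // prints -1
        }
    }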

Abstraction is when you translate the equations, the rules, and the underlying essences of the problem not only into a language that can be communicated to your friend standing with you on a streetcar, but also into a form that can percolate down and dwell in your subconscious. Because, remember, it is your subconscious that makes the miraculous leaps of inspiration, not your plodding perspiration and not your cocky logic. And remember, unlike you, your subconscious does not understand Java code.

Bookmarks

Cover

Half-title

Title

Copyright

CONTENTS

PREFACE

Introduction

PART ONE: Iterative Algorithms and Loop Invariants

1 Iterative Algorithms: Measures of Progress and Loop Invariants

1.1 A Paradigm Shift: A Sequence of Actions vs. a Sequence of Assertions

1.2 The Steps to Develop an Iterative Algorithm

1.3 More about the Steps

1.4 Different Types of Iterative Algorithms

1.5 Typical Errors

1.6 Exercises

2 Examples Using More-of-the-Input Loop Invariants

2.1 Coloring the Plane

2.2 Deterministic Finite Automaton

2.3 More of the Input vs. More of the Output

3 Abstract Data Types

3.1 Specifications and Hints at Implementations

3.2 Link List Implementation

3.3 Merging with a Queue

3.4 Parsing with a Stack

4 Narrowing the Search Space: Binary Search

4.1 Binary Search Trees

4.2 Magic Sevens

4.3 VLSI Chip Testing

4.4 Exercises

5 Iterative Sorting Algorithms

5.1 Bucket Sort by Hand

5.2 Counting Sort (a Stable Sort)

5.3 Radix Sort

5.4 Radix Counting Sort

6 Euclid’s GCD Algorithm

7 The Loop Invariant for Lower Bounds

PART TWO: Recursion

8 Abstractions, Techniques, and Theory

8.1 Thinking about Recursion

8.2 Looking Forward vs. Backward

8.3 With a Little Help from Your Friends

8.4 The Towers of Hanoi

8.5 Checklist for Recursive Algorithms

8.6 The Stack Frame

8.7 Proving Correctness with Strong Induction

9 Some Simple Examples of Recursive Algorithms

9.1 Sorting and Selecting Algorithms

9.2 Operations on Integers

9.3 Ackermann's Function

9.4 Exercises

10 Recursion on Trees

10.1 Tree Traversals

10.2 Simple Examples

10.3 Generalizing the Problem Solved

10.4 Heap Sort and Priority Queues

10.5 Representing Expressions with Trees

11 Recursive Images

11.1 Drawing a Recursive Image from a Fixed Recursive and a Base Case Image

11.2 Randomly Generating a Maze

12 Parsing with Context-Free Grammars

PART THREE: Optimization Problems

13 Definition of Optimization Problems

14 Graph Search Algorithms

14.1 A Generic Search Algorithm

14.2 Breadth-First Search for Shortest Paths

14.3 Dijkstra's Shortest-Weighted-Path Algorithm

14.4 Depth-First Search

14.5 Recursive Depth-First Search

14.6 Linear Ordering of a Partial Order

14.7 Exercise

15 Network Flows and Linear Programming

15.1 A Hill-Climbing Algorithm with a Small Local Maximum

15.2 The Primal-Dual Hill-Climbing Method

15.3 The Steepest-Ascent Hill-Climbing Algorithm

15.4 Linear Programming

15.5 Exercises

16 Greedy Algorithms

16.1 Abstractions, Techniques, and Theory

16.2 Examples of Greedy Algorithms

16.2.1 Example: The Job/Event Scheduling Problem

16.2.2 Example: The Interval Cover Problem

16.2.3 Example: The Minimum-Spanning-Tree Problem

16.3 Exercises

17 Recursive Backtracking

17.1 Recursive Backtracking Algorithms

17.2 The Steps in Developing a Recursive Backtracking

17.3 Pruning Branches

17.4 Satisfiability

17.5 Exercises

18 Dynamic Programming Algorithms

18.1 Start by Developing a Recursive Backtracking

18.2 The Steps in Developing a Dynamic Programming Algorithm

18.3 Subtle Points

18.3.1 The Question for the Little Bird

18.3.2 Subinstances and Subsolutions

18.3.3 The Set of Subinstances

18.3.4 Decreasing Time and Space

18.3.5 Counting the Number of Solutions

18.3.6 The New Code

19 Examples of Dynamic Programs

19.1 The Longest-Common-Subsequence Problem

19.2 Dynamic Programs as More-of-the-Input Iterative Loop Invariant Algorithms

19.3 A Greedy Dynamic Program: The Weighted Job/Event Scheduling Problem

19.4 The Solution Viewed as a Tree: Chains of Matrix Multiplications

19.5 Generalizing the Problem Solved: Best AVL Tree

19.6 All Pairs Using Matrix Multiplication

19.7 Parsing with Context-Free Grammars

19.8 Designing Dynamic Programming Algorithms via Reductions

20 Reductions and NP-Completeness

20.1 Satisfiability Is at Least as Hard as Any Optimization Problem

20.2 Steps to Prove NP-Completeness

20.3 Example: 3-Coloring Is NP-Complete

20.4 An Algorithm for Bipartite Matching Using the Network Flow Algorithm

21 Randomized Algorithms

21.1 Using Randomness to Hide the Worst Cases

21.2 Solutions of Optimization Problems with a Random Structure

PART FOUR: Appendix

22 Existential and Universal Quantifiers

23 Time Complexity

23.1 The Time (and Space) Complexity of an Algorithm

23.2 The Time Complexity of a Computational Problem

24 Logarithms and Exponentials

25 Asymptotic Growth

25.1 Steps to Classify a Function

25.2 More about Asymptotic Notation

26 Adding-Made-Easy Approximations

26.1 The Technique

26.2 Some Proofs for the Adding-Made-Easy Technique

27 Recurrence Relations

27.1 The Technique

27.2 Some Proofs

28 A Formal Proof of Correctness

PART FIVE: Exercise Solutions

Chapter 1. Iterative Algorithms: Measures of Progress and Loop Invariants

Chapter 2. Examples Using More-of-the-Input Loop Invariants

Chapter 3. Abstract Data Types

Chapter 4. Narrowing the Search Space: Binary Search

Chapter 6. Euclid’s GCD Algorithm

Chapter 7. The Loop Invariant for Lower Bounds

Chapter 8. Abstractions, Techniques, and Theory

Chapter 9. Some Simple Examples of Recursive Algorithms

Chapter 10. Recursion on Trees

Chapter 11. Recursive Images

Chapter 12. Parsing with Context-Free Grammars

Chapter 14. Graph Search Algorithms

Chapter 15. Network Flows and Linear Programming

Chapter 16. Greedy Algorithms

Chapter 17. Recursive Backtracking

Chapter 18. Dynamic Programming Algorithms

Chapter 19. Examples of Dynamic Programs

Chapter 20. Reductions and NP-Completeness

Chapter 22. Existential and Universal Quantifiers

Chapter 23. Time Complexity

Chapter 24. Logarithms and Exponentials

Chapter 25. Asymptotic Growth

Chapter 26. Adding-Made-Easy Approximations

Chapter 27. Recurrence Relations

CONCLUSION

INDEX

About the Author

Jeff Edmonds received his Ph.D. in 1992 from the University of Toronto in theoretical computer science. His thesis proved that certain computational problems require a given amount of time and space. He did his postdoctoral work at the ICSI in Berkeley on secure multimedia data transmission and in 1995 became an Associate Professor in the Department of Computer Science at York Univer...
