Algorithms that process large data sets must take into account that the cost of a memory access depends on where the data is stored. Traditional algorithm design is based on the von Neumann model, in which all memory accesses have uniform cost. Actual machines increasingly deviate from this model: while waiting for a single main-memory access, a modern microprocessor can in principle execute on the order of 1000 register additions; for hard disk access this factor can...
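The effect described above is easy to observe in practice. The following is a minimal sketch (not from the book) that times a sequential versus a randomly ordered traversal of the same array; the array size and the timing approach are illustrative assumptions, but on most machines the cache-unfriendly random walk is noticeably slower even though both loops do identical arithmetic.

```python
import random
import time

# Array large enough that it does not fit in typical CPU caches
# (an illustrative choice, not a figure from the book).
N = 1 << 22
data = list(range(N))

seq_order = list(range(N))          # visits indices 0, 1, 2, ...
rand_order = seq_order[:]
random.shuffle(rand_order)          # same indices, cache-hostile order

def walk(order):
    """Sum data[] in the given index order and return (seconds, sum)."""
    t0 = time.perf_counter()
    total = 0
    for i in order:
        total += data[i]
    return time.perf_counter() - t0, total

t_seq, s_seq = walk(seq_order)
t_rand, s_rand = walk(rand_order)

# Both traversals compute the same sum; only the access pattern differs.
print(f"sequential: {t_seq:.3f}s  random: {t_rand:.3f}s")
```

The two loops perform exactly the same additions, so any difference in running time comes purely from where in the memory hierarchy each access is served, which is precisely the deviation from the uniform-cost model that the text points out.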