Computer-Aided Manufacturing (CAM) describes the use of computers and computer technology to assist in all phases of manufacturing, including process and production planning, scheduling, manufacture, quality control and management. Historically, CAM technology was sparked by the invention of NC (Numerical Control) machine tools, developed to manufacture complex shapes accurately. Internationally, NC machines are directed by part programs following the industrial data standard RS-274-D, standardized internationally as ISO 6983. The standard defines a set of ‘M’ and ‘G’ codes that specify a sequence of cutting tool movements, the direction of spindle rotation, the speed of travel, and various auxiliary functions, such as coolant flow.
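As a rough illustration of the RS-274-D style, the following is a minimal hedged sketch of a part-program fragment; the exact codes available and their semantics vary by machine controller, and the coordinates here are arbitrary:

```gcode
N10 G21 G90          (G21: metric units, G90: absolute positioning)
N20 M03 S1200        (M03: start spindle clockwise at 1200 rpm)
N30 M08              (M08: coolant on - an auxiliary 'M' function)
N40 G00 X0 Y0 Z5     (G00: rapid move above the part origin)
N50 G01 Z-2 F100     (G01: linear feed move down into the material)
N60 G01 X50 Y25      (linear cutting move at the current feed rate)
N70 M09              (coolant off)
N80 M05              (spindle stop)
N90 M30              (end of program)
```

Here the ‘G’ codes command motion and modal states while the ‘M’ codes handle auxiliary functions such as spindle rotation and coolant flow, matching the division of labor described above.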
The first generation of Computer-Aided Manufacturing (CAM) emerged when the Automatically Programmed Tool (APT) language was developed at the Massachusetts Institute of Technology (MIT) in the 1950s to help control NC machines. APT is a universal programming language for NC machines and has been widely adopted. It provides a convenient way to define geometric elements and generate cutter locations for NC programs by computer. APT was created before graphical interfaces were available, so it relies on text to specify the geometry and tool paths needed to machine a part. This poses a significant potential for errors in defining complex geometries and tool-positioning commands. This problem was overcome by the introduction of graphics-based CAM in the 1980s, which allowed part geometry to be described in the form of points, lines, arcs and so on, rather than requiring translation into a text-oriented notation.
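To give a flavor of APT's text-oriented style, the following is a hedged sketch of a short part program; the geometry, tolerances and names (P1, L1, C1) are invented for illustration, and real APT dialects differ in detail:

```apt
PARTNO/EXAMPLE PLATE
$$ Define geometry symbolically as points, lines and circles
P1 = POINT/25, 25, 0
P2 = POINT/75, 25, 0
L1 = LINE/P1, P2
C1 = CIRCLE/CENTER, P1, RADIUS, 10
$$ Specify the cutter and machining conditions
CUTTER/10
SPINDL/1200, CLW
FEDRAT/100
$$ Motion statements drive the tool relative to the geometry
FROM/0, 0, 50
GOTO/P1
GOLFT/L1, PAST, C1
FINI
```

The example shows why the approach was error-prone: every point, line and motion must be spelled out textually and kept mutually consistent by the programmer, which is exactly the burden that graphics-based CAM later removed.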