An Incremental Approach To Generative NC
This Air Force production facility is automating NC programming to dramatically reduce the time required to model and manufacture emergency replacement parts. Their "virtual factory" approach models part and process concurrently.
The sprawling Tinker Air Force Base in Oklahoma City is home to one of the nation's largest military aircraft maintenance and repair facilities. The center is made up of several Product Directorates responsible for managing weapon systems such as the B-1B, B-2, KC-135 and E-3, along with jet engines and commodities. One directorate is responsible for bringing new technologies to its fellow directorates. In the finest military lexicon, it is known as the Technology and Industrial Support Directorate of the Oklahoma City Logistics Center--or simply TI for short. Together, the Commodities Directorate (LI) and TI operate a highly sophisticated engineering, testing and manufacturing operation capable of rebuilding jet engines as well as many structural components of military airplanes.
Part of TI's job is inserting new technology to assist LI in making emergency replacements for unexpectedly failed aircraft parts. Say, for example, an Air Force technician in Alaska notices a cracked bracket on the landing gear of a C-130 transport plane. The part is removed and sent to Tinker, where it undergoes failure analysis and re-engineering. A new replacement part is then made. Time is of the essence because not only is this one plane inoperable, but the entire fleet may be grounded until LI gets to the heart of the problem and manufactures enough parts to remedy the immediate situation.
For many years, Tinker's staff has been good at machining these often complicated parts on three-, four- and five-axis machining centers. But in more recent times their NC programming capabilities had grown outdated. They started with very basic systems, including APT language programming. In the mid-1980s, they moved to a then-sophisticated CAD/CAM system, which had become woefully inadequate by 1990s standards. Three years ago, TI made a major step forward with the addition of a solid modeling CAD system that helped compress the engineering cycle. NC programming, however, was helped very little, and that function still accounted for the vast majority of production time. Now TI is taking the final step by automating programming with a generative NC CAM system capable of reading a part feature, such as a hole, and automatically generating the process and NC code necessary to manufacture it.
TI hasn't automated all such feature-making processes and probably never will. With an incremental approach, taking it feature by feature, they are already reaping large reductions in part-programming time. And as their CAM system continues to acquire knowledge, they are opening the door to extraordinary new capabilities for modeling part and process, both for parts they make themselves and for those made in other facilities. Here's how they are approaching generative NC at Tinker.
The Core
According to Ed Kincaid, team leader of technology insertion at Tinker, the new generation of their design and manufacturing system began in 1992 when TI replaced the old CAD/CAM system with solid modeling CAD/CAM/CAE software, the I-DEAS Master Series from SDRC (now part of UGS Corporation). There are several critical points to the concept behind this particular system as it applies to the requirements of TI's mission:
First, it is feature-based, meaning that a solid model can be defined in terms of the part's features--through-hole, slot, boss, rib, pocket and so on--rather than having to build these elements from individual points, lines, arcs, curves, or solid primitives such as cylinders, spheres and cones. This technique simplifies and speeds the creation of geometry but, more important from an NC programming standpoint, it allows manufacturing requirements such as tolerance and surface finish to be associated with each feature. More on this later.
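To make the point concrete, here is a minimal sketch--written in Python purely for illustration, and not a reflection of how I-DEAS actually stores its data--of a feature that carries its manufacturing requirements along with its geometry. Field names and values are assumptions for the sake of the example.

```python
# Illustrative only: manufacturing requirements ride along with each named
# feature rather than with raw geometry.
from dataclasses import dataclass, field

@dataclass
class Feature:
    kind: str              # "through-hole", "slot", "boss", "rib", "pocket", ...
    dimensions: dict       # e.g. {"diameter": 0.750, "depth": 1.00}, in inches
    tolerance: float       # allowable deviation, inches
    surface_finish: float  # required finish, microinches Ra

@dataclass
class PartModel:
    name: str
    material: str
    features: list = field(default_factory=list)

# Hypothetical values for a bellcrank-type part; not Tinker's actual data.
bellcrank = PartModel(name="bellcrank", material="4340 steel")
bellcrank.features.append(Feature(
    kind="through-hole",
    dimensions={"diameter": 0.750, "depth": 1.00},
    tolerance=0.0005,
    surface_finish=32))
```

Because the tolerance and finish belong to the hole itself, a downstream NC module can interrogate the feature directly rather than re-deriving requirements from a drawing.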
Second, the construction of geometry in CAD is based on the variational modeling technique. Variational modeling is closely related to the more commonly recognized "parametric modeling," in which features and the relationships between features can be defined independently of fixed dimensions. For example, one wall of a model can be defined as parallel or perpendicular to another, or the ends of a cylinder (which in fact represents a hole) can be defined as lying on opposite walls of the solid. This way, the functional intent of various features can be established rather easily, and the model can then be edited without worry of disturbing these critical relationships. Stock thickness can be altered, for example, without having to individually edit the hole cylinder, or the entire model can be scaled and rescaled without having to reconstruct any geometry at all. Such capabilities are particularly useful when models are frequently altered as a result of finite element analysis or prototype testing.
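A toy example, again in illustrative Python rather than the CAD system's own constraint machinery, shows the principle: the through-hole's length is defined in terms of the stock thickness, so editing the thickness never breaks the hole.

```python
# A toy illustration of the parametric/variational principle -- not how I-DEAS
# represents constraints. The hole length is a relationship, not a fixed number.
class Plate:
    def __init__(self, thickness, hole_diameter):
        self.thickness = thickness          # driving dimension
        self.hole_diameter = hole_diameter

    @property
    def hole_length(self):
        # Defined relationally ("spans the full stock"), not as a stored value.
        return self.thickness

plate = Plate(thickness=0.500, hole_diameter=0.250)
print(plate.hole_length)   # 0.5
plate.thickness = 0.750    # alter the stock thickness ...
print(plate.hole_length)   # 0.75 -- the hole follows without any geometry rework
```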
And third, the system is fully integrated and associative, meaning that once a model is created in CAD it can be easily passed on to other engineering and manufacturing applications so that these functions can proceed concurrently, which is particularly important in light of TI's quick-response objectives. Moreover, each function works directly from the same model--rather than translating models into each application--so that any necessary design changes automatically propagate throughout the system.
Shortly after implementing the system, TI had an opportunity to test its effectiveness. A B-1B nose landing gear failed to lock in an uplock position during flight. A subsequent inspection revealed visible cracks in the bellcrank component (a hinge of sorts) of the locking mechanism. In just two days, LI/TI was able to capture the part configuration, run a finite element analysis, and then modify the structure to reinforce several high-stress areas. Programming the part, however, was quite another matter. In all, it took 37 days to move from design to a proven NC program. To be fair, some two weeks of that time was consumed in gaining approval for the design modifications. Still, they believed the "art-to-part" cycle was simply too long to live up to TI's goals of timely production.
Enter GNC
What Mr. Kincaid believed they needed was a way to automate the NC programming function, which was no easy task considering the geometric complexity of many of the parts TI processes. But the timing was good. SDRC had spent a great deal of development effort on a new generative NC module and was anxious to prove it in a demanding beta test environment. Tinker was just the place, and SDRC offered their help in capturing the shop's preferred methods in the manufacturing system.
Generative NC programming--also sometimes referred to as "knowledge-based" programming--is a technology in which process planning expertise is captured in a database and then applied to automatically generate a part program. That knowledge can include equipment selections, machining strategies, raw material stocks, tooling selections, feeds and speeds, and many more variables. In operation, the idea is to code all the decisions a programmer would normally work out interactively in CAM into a complex set of rules that are applied to analyze the workpiece requirements and then select the appropriate process steps to actually machine the part. For example: The part is round, so it goes to a lathe. The OD is 1-7/8 inches, so it will be cut from 2-inch bar stock. A total of 1/8 inch of stock is to be removed, so it will require one roughing pass and one finishing pass. All these decisions are coded in the form of if-then statements--if the stock to be removed is between 0.1 and 0.2 inch, then use one roughing pass at a DOC of total cut depth minus 0.01 inch and one finish pass at 0.01-inch DOC. Once a rule base is established that covers all the conditions for a workpiece to be processed, the part program can be generated automatically.
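Written in plain Python rather than a CAM system's rule language, the lathe example above might be sketched as follows. The function, the list of standard bar stocks, and anything else not stated in the article are illustrative assumptions, not Tinker's actual rule base.

```python
# A minimal sketch of the if-then logic in the lathe example above.
def plan_turning(od_finished, bar_stocks=(1.0, 1.5, 2.0, 2.5, 3.0)):
    machine = "lathe"  # Rule: the part is round, so it goes to a lathe.

    # Rule: use the smallest standard bar stock that covers the finished OD.
    bar = min(d for d in bar_stocks if d >= od_finished)

    # Rule: size the passes from the total stock to be removed (on diameter).
    stock = bar - od_finished
    if 0.1 <= stock <= 0.2:
        passes = [("rough", round(stock - 0.01, 3)), ("finish", 0.01)]
    else:
        passes = [("rough", round(stock, 3))]  # other ranges get their own rules

    return {"machine": machine, "bar_stock": bar, "passes": passes}

print(plan_turning(od_finished=1.875))
# {'machine': 'lathe', 'bar_stock': 2.0,
#  'passes': [('rough', 0.115), ('finish', 0.01)]}
```

A real rule base, of course, would fold in material, tooling, feeds and speeds, and every other condition the shop expects to encounter.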
Or at least that's the theory. The practice is more a matter of fine and numerous details. The heart of generative NC really lies at the feature level, which is precisely why a feature-based CAD model is so critical to this technology. The idea is for the system to recognize a standard feature--say, a hole--and then apply a generic process for creating that hole, plugging in the variables of the case at hand, including the specific dimensions, the workpiece material, and, very importantly, the required tolerances--all of which are associated with the feature in the solid model. If the hole tolerances are loose, it may call for a simple drilling operation, drawing from a library of tools in the system database, and applying appropriate feeds and speeds from tables also included in the database. If hole location is critical, it may put a spot-drilling operation into the routine. A tighter bore tolerance may trigger a boring or reaming operation. An even tighter tolerance may signal the need for a secondary grinding operation, and thus the drilling and boring operations are set to leave appropriate amounts of stock for the subsequent operations. In any event, once a feature is recognized by the system, its machining process is constructed, and then output as completed NC tool path code.
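Continuing the hole example in the same illustrative vein, a tolerance-driven rule might be sketched like this. The thresholds and stock allowances are assumptions chosen for the example, not Tinker's actual values; the point is simply that tighter tolerances add operations and that earlier steps leave stock for the later ones.

```python
# Hedged sketch of tolerance-driven operation selection for a hole.
def plan_hole(diameter, tolerance, location_critical=False):
    ops = []
    if location_critical:
        ops.append(("spot drill", diameter))

    if tolerance >= 0.005:                       # loose: drilling alone will do
        ops.append(("drill", diameter))
    elif tolerance >= 0.001:                     # tighter: drill, then bore
        ops.append(("drill", diameter - 0.015))  # leave stock for boring
        ops.append(("bore", diameter))
    else:                                        # tightest: finish by grinding
        ops.append(("drill", diameter - 0.030))
        ops.append(("bore", diameter - 0.005))   # leave grinding stock
        ops.append(("grind", diameter))
    return ops

for op, dia in plan_hole(diameter=0.750, tolerance=0.0005, location_critical=True):
    print(f"{op:10s} to {dia:.3f} in.")
# spot drill to 0.750 in. / drill to 0.720 in. / bore to 0.745 in. / grind to 0.750 in.
```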
While this programming methodology is manageable enough at the individual feature level, it quickly gets exceedingly complicated when one begins to incorporate the many feature and process variables present in an average metalworking enterprise. Capturing all contingencies in a master set of rules is a huge undertaking, an issue that no doubt has been the largest obstacle to broader acceptance of generative NC technology.
Given the complicated nature of the work at Tinker, it was no small concern there either, so rather than trying to automate all their programming tasks, they took it a feature at a time, starting with the most common. The generative NC technique would be mixed with conventional interactive programming, both of which are conducted on the same CAM system. The idea was to target the most repetitive programming tasks, which account for the largest portions of programming time. While purists might not like the approach, Mr. Kincaid is very pleased with the practical benefits: saving a great deal of time already, yet not being hampered in day-to-day productivity while the new technology phases in. Though TI has captured just ten of the 29 feature processes planned, they've already effected a 30- to 80-percent decrease in programming time over their prior method, depending on the part. With further automation, he hopes to achieve a 90-percent overall reduction.
The Big Picture
Besides faster programming, it is important to recognize how the integration of design, engineering and manufacturing systems contributes to Tinker's overall objective of total manufacturing cycle time reduction. In days past, manufacturing was sequential: from design, to engineering, to NC programming, to prototype, and finally on to production. If more changes were necessary, much of that development process had to be repeated in sequence.
With the new process, however, much of the product and process development can proceed concurrently. While the engineering work is being done, manufacturing can go ahead and process the part, select and schedule necessary manufacturing resources, and move on to generate tool-path code. And should changes be required as a result of TI's own finite element analysis, or from requests by outside project team members, it's a relatively simple matter to edit the part's original solid model.
Perhaps another case study better illustrates the results. By the middle of 1994, Tinker had made considerable progress in honing their operational skills with the system and in capturing manufacturing processes. A real test of those capabilities came in June with the failure of an aileron bracket on a C-130 transport plane. In a subsequent inspection of the fleet, cracks were found on nearly half the brackets, with the potential of grounding aircraft if the supply of brackets was not maintained. By LI's old methods, it was estimated that it would take four weeks to deliver replacement parts, which was clearly an unacceptable resolution.
This time TI would employ the new technology. The initial design/manufacturing model was created in a day. The replacement parts would be machined from solid blocks of aluminum, and so it took another two days to create the initial tool paths. Meanwhile, analysis was performed to identify high stress points, and some modifications were made to improve the part strength. Within five days from the start of the project, they were machining a prototype part. By the eleventh day, they were in production. Six days after that, Tinker delivered mission-ready parts. In total, more than two weeks had been taken out of the anticipated design, engineering and programming cycle, and that included a post-prototype material switch to stainless steel.
The Really Big Picture
What the burgeoning CAD/CAM/CAE capabilities at Tinker are really about, says Mr. Kincaid, is the creation of what they call the "Virtual Factory." That is, through their ability to quickly model the workpiece geometry, its performance under load, and ultimately the entire manufacturing process by which it will be made, LI will eventually be able to get parts into production in a matter of hours where it once took weeks. And that may even apply to workpieces physically machined in other facilities that could be located anywhere in the world.
Too good to be true? Perhaps, but consider the system they are creating. When a feature-based model is first created in CAD--or, as they say at Tinker, in the product/process design database--it includes virtually all the information that will be required for manufacturing the part. Besides geometry and tolerances, it can include raw material type and configuration, mating part relationships and assembly requirements, loads and constraints, and more. Also in the system database, they capture their best-practice process knowledge. That includes rules to define proper methods planning and tables of appropriate process parameters. And they capture "site knowledge," meaning awareness of available machine tools and their performance characteristics, workholding techniques, and even the expertise of the operators. Not only can a primary manufacturing site be included, but also remote sites. So part and process can be modeled in Oklahoma City and then transferred to the remote site.
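What a "site knowledge" record actually contains can only be guessed at from the outside, but based on the categories described above it might look something like the following speculative sketch. Every field name and value here is an assumption, not the actual database schema.

```python
# Purely speculative example of a site-knowledge record, covering the categories
# named in the article: machine capabilities, workholding and operator expertise.
site_knowledge = {
    "site": "Tinker AFB, Oklahoma City",
    "machines": [
        {"type": "5-axis machining center", "envelope_in": (40, 20, 20), "max_rpm": 10000},
        {"type": "CNC lathe", "max_swing_in": 12, "max_rpm": 4000},
    ],
    "workholding": ["vises", "fixture plates", "3-jaw chucks"],
    "operator_expertise": ["titanium machining", "thin-wall aluminum parts"],
}
```

With a record like this on file for each participating shop, a process modeled in Oklahoma City could, in principle, be checked against the capabilities of whichever site will actually cut the part.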
TI has already begun testing this concept with various sites in the Oklahoma state vocational education network. Once it is proven, Mr. Kincaid would like to see commercial parts suppliers brought into TI's site knowledge database. If that happens, it will enable TI to transfer some of their own best practices to the private sector and to learn better methods from external sources as well.
That's all in the future, of course, and perhaps a bit speculative. But even if it never happens, generative NC has nonetheless moved this parts-making operation toward more timely and productive performance.