OPENING OF THE 2020/21 ACADEMIC YEAR
"Past, present and future of the analysis and design of structures"
Eduardo Bayo Pérez
Dr. Ingeniero de Caminos, Full Professor of Continuum Mechanics and Theory of Structures
School of Architecture, University of Navarra
Your Excellency, the Rector Magnificus,
Distinguished Authorities,
Faculty and Students,
Ladies and Gentlemen:
Every time we cross a bridge or the crest of a dam, enter a building, ride in an automobile, fly in an airplane, sail in a ship, enter an industrial or agricultural building, or walk over a forest footbridge, we are using and depending on a structure. This wide variety of applications makes strength of materials and theory of structures part of the curriculum of most engineering and architecture degrees.
Structural design began in ancient times following an inductive process: from the data provided by built structures and from accumulated experience, lessons were learned and generalized to other cases. It was known what to do (and how to do it) so that a structure would not fail, but not why. It is logical to think that these rules were transmitted between generations, as shown in the Code of Hammurabi, where the figure of the architect already appears and where it is established how constructions should be built on the basis of such rules.
The use of scientific principles in the design of structural and mechanical systems began with Greek civilization. It is then that the first relations between mathematics, mechanics and structures appear, and from that moment on these subjects have remained firmly united to this day. We know that it was Archimedes who solved the first problems of statics in his studies of levers, building on the studies of forces previously carried out by Aristotle.
Few important writings survive from before the Renaissance. However, it is most likely that the construction of churches and cathedrals was based on the already known laws of statics. During the Renaissance there was a resurgence of interest in science, and great figures appeared in the fields of architecture and engineering. Leonardo da Vinci addressed, for the first time, basic questions about the bending of beams, the failure of columns and the equilibrium of arches, and carried out experiments on the strength of materials. This began to stimulate and concern the architects and builders of the time.
Galileo stands out as the genius capable of answering these questions. In his Two New Sciences, he analyzes the resistance of beams in tension and bending, and the buckling of columns. During the 17th century and the beginning of the 18th century, great mathematical contributions were made on the basis of the differential calculus developed by Newton and Leibniz. Thus began the mathematical theory of stresses. Leibniz carried out the first stress analyses on beams and concluded that the bending moment is proportional to the moment of inertia of the section. Hooke, of the Royal Society of London, formulated the law of proportionality between stresses and strains, which constitutes the basic hypothesis of elasticity. In the 18th century Jacob Bernoulli elaborated a theory for locating the neutral fiber in beams, and in his studies a complete relation between stresses and strains appears for the first time. Euler and Lagrange made important discoveries in the field of beam deflections and column stability.
From this moment on, it can be said that a new era in the analysis of structures began, and a deductive process for its understanding was progressively initiated. This process involves a series of steps: identification of the behavior, use of a physical-mathematical procedure to formulate a theory, and finally verification of the theory through its application to different cases. These theories are subjected to continuous checking, and possible refutation, according to how well they model reality in its different cases. The deductive process therefore leads to knowing not only what to do and how to do it, but also why.
The end of the 18th century and the first half of the 19th century marked the preponderance of the French school and its system of Écoles. From them emerged great contributions to structural analysis, such as those of Navier, who formulated the three-dimensional theory of linear elasticity, and Cauchy, who established the existence of the stress tensor and the equations of internal equilibrium. During the second half of the 19th century and the first half of the 20th century, important advances were made in the theory of structures: bending, torsion, plate theory, structural dynamics, photoelasticity, plastic behavior and failure criteria. A more general science was thus created, called continuum mechanics, which accompanies the theory of structures. The demands of these advances made it necessary to strengthen the study of mathematics, strength of materials and theory of structures in the training of engineers and architects involved in structural design.
Reinforced concrete appeared in 1848, but it was not until several decades later that its importance began to be appreciated, and by the end of the 19th century its use in buildings and bridges began to spread. In the first decades of the 20th century, the development of concrete and the improvement of the joining elements in rolled steel structures made it possible to build bridges with longer spans, taller buildings and slimmer roofs. All this had a considerable influence on the development of structural analysis, leading to numerous studies on the behavior of hyperstatic arches, continuous beams, slabs, domes, shell structures and portal frames (such as the famous method of the American civil engineer Hardy Cross, developed in 1930, which was the basis for the analysis of countless framed structures for decades).
To this era belong emblematic structures such as the Golden Gate Bridge, designed by the civil and structural engineers Strauss and Ellis, built in 1937, whose 1,280-meter span between towers held the world record until 1964; and the Empire State Building, built in 1931 and 443 meters tall, whose structure was designed by the civil and structural engineer Homer Balcom, who also designed the structures of Rockefeller Center and of the Waldorf-Astoria Hotel, built in 1931, which held the record for the tallest hotel in the world until 1963. In Spain, it is worth mentioning the shell structures of the civil engineer Eduardo Torroja, such as the roof of the Zarzuela Hippodrome, designed in 1933 and built in 1941, and the Recoletos Fronton and the roof of the Algeciras market, both built in 1935.
Thus we arrive at the second half of the 20th century with a broad knowledge of the theory of elasticity and plasticity, of the statics and dynamics of structures, and of the properties of materials. Numerical and variational methods and matrix calculus, already outlined in the 19th century, were also known. The main problem was the inability to solve the large number of equations generated. It was then that the computer appeared, and all these theories and techniques opened up an unexplored field. The computer, with its computing power, revolutionized not only the analysis of structures but all branches of science. With this new tool available, the physical problem could now be analyzed with greater breadth. In the analysis of structures, the previously developed mathematical theories gained their full importance, and more complex problems were tackled with greater accuracy.
As a consequence, matrix calculus began to be applied to the analysis of structures in the aeronautical industry with the use of computers in the early 1950s. Then, in 1956, Ray Clough (professor of civil and structural engineering at the University of California, Berkeley), together with Turner, Martin and Topp, extended the matrix formulation, giving rise to the well-known finite element method. In 1957 Clough took a sabbatical year from his teaching and administrative duties, which he spent at the University of Trondheim in Norway, where he continued to work on the new method and began to call it by the name of finite elements. Basically, the method consists of discretizing the continuum by creating a mesh of elements, imposing on them the conditions of material behavior and deformation compatibility, and fulfilling equilibrium in an integral way at the nodes of the mesh. All this developed very quickly, so that by the beginning of the 1960s the method was already firmly established and beginning to be popular for the analysis of structures.
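The discretization just described can be illustrated with a minimal sketch: a one-dimensional bar fixed at one end and loaded axially at the other, split into two equal elements. Every numerical value here (unit axial stiffness EA, unit element length, unit tip load) is an illustrative assumption, not part of the historical account.

```python
# Minimal 1D finite element sketch: an axially loaded bar fixed at the
# left end, discretized into two equal elements (three nodes).
# Illustrative values: EA = 1, element length L = 1, tip load P = 1.

def gauss_solve(A, b):
    """Tiny Gaussian elimination (fine for a small SPD stiffness matrix)."""
    n = len(b)
    for i in range(n):
        for j in range(i + 1, n):
            m = A[j][i] / A[i][i]
            for c in range(i, n):
                A[j][c] -= m * A[i][c]
            b[j] -= m * b[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

def solve_bar(n_elems=2, EA=1.0, L=1.0, P=1.0):
    n_nodes = n_elems + 1
    k = EA / L  # stiffness of each two-node element
    # Assemble the global stiffness matrix from element contributions.
    K = [[0.0] * n_nodes for _ in range(n_nodes)]
    for e in range(n_elems):
        K[e][e] += k
        K[e][e + 1] -= k
        K[e + 1][e] -= k
        K[e + 1][e + 1] += k
    # Load vector: point load P at the free (right-most) node.
    F = [0.0] * n_nodes
    F[-1] = P
    # Boundary condition u(0) = 0: remove the first row and column.
    A = [row[1:] for row in K[1:]]
    b = F[1:]
    return gauss_solve(A, b)  # displacements of the free nodes

u = solve_bar()
print(u)  # matches the exact solution u(x) = P*x/(EA): [1.0, 2.0]
```

For this uniform bar the mesh reproduces the exact solution at the nodes; the power of the method lies in applying the same assembly procedure to geometries with no closed-form answer.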
Some Berkeley professors doubted the method and proposed that it be tested on a series of cases, among them the classical problem of stress concentration around an elliptical hole in a steel plate under tension. A student was tasked with performing the meshing and the analysis, which gave very good results, and many criticisms and doubts about the new method were dispelled.
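That benchmark was well chosen because it has a classical closed-form answer against which the finite element results could be checked: Inglis's solution for an elliptical hole in an infinite plate under remote tension. A short sketch (the specific semi-axis values are illustrative):

```python
# Inglis stress-concentration factor for an elliptical hole in an
# infinite plate under remote tension sigma applied perpendicular to
# the major semi-axis a (b is the semi-axis along the load direction):
#     sigma_max = sigma * (1 + 2 * a / b)

def stress_concentration(a, b):
    """Return the factor sigma_max / sigma at the tip of the major axis."""
    return 1.0 + 2.0 * a / b

# A circular hole (a == b) recovers Kirsch's classical factor of 3.
print(stress_concentration(1.0, 1.0))  # 3.0
# A slender ellipse concentrates stress far more strongly.
print(stress_concentration(5.0, 1.0))  # 11.0
```

A finite element mesh refined around the hole should converge to these factors, which is presumably what made the test convincing.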
Simultaneously, when mathematicians studied this technique, they realized that it amounted to an extension of Ritz's variational method using admissible functions within each element. It then became clear that the method is applicable to the solution of all problems modeled by partial differential equations. As a consequence, in the following years its use progressed not only in the analysis of structures but also in heat transfer and thermodynamic processes, fluid dynamics, aeroelasticity and fluid-structure interaction, flexible mechanisms and multibody systems, electromagnetic fields, and more recently even in finance and economic models. Furthermore, when compared with the finite difference method then in vogue, the finite element method was found to have greater generality and versatility.
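The connection to Ritz's method can be made concrete with a one-dimensional model problem; the specific equation, -u'' = 1 on (0, 1) with u(0) = u(1) = 0, and the single trial function are illustrative assumptions chosen for brevity. Minimizing the energy functional over the one-parameter trial space reduces the differential equation to a single algebraic equation:

```python
# Ritz sketch for -u'' = 1 on (0, 1) with u(0) = u(1) = 0, using the
# single admissible trial function phi(x) = x*(1 - x).
# Minimizing J(c) = 0.5*c**2 * Int[(phi')**2] - c * Int[phi]
# gives c = Int[phi] / Int[(phi')**2].

def midpoint_integral(f, n=1000):
    """Midpoint-rule quadrature of f over (0, 1)."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h) for i in range(n)) * h

phi = lambda x: x * (1.0 - x)        # admissible trial function
dphi = lambda x: 1.0 - 2.0 * x       # its derivative

c = midpoint_integral(phi) / midpoint_integral(lambda x: dphi(x) ** 2)
print(c)  # ~0.5: the exact solution x*(1-x)/2 lies in the trial space
```

The finite element method does exactly this, but with piecewise polynomial admissible functions defined element by element, which is why it inherits the variational convergence theory.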
Likewise, aspects related to numerical analysis and computer programming became very important, and computational methods became a common working tool in many engineering fields. Finite element computer programs such as SAP (developed at the University of California, Berkeley), STRUDL (developed at MIT) and NASTRAN (developed by NASA) appeared in the 1970s and constitute the basis of the commercial software used today.
During the 1970s and 1980s, the finite element method and numerical methods were mostly taught in specialization and graduate courses. However, owing to their enormous spread and use, students are nowadays put in contact with these methods as soon as possible, and they form part of the undergraduate engineering curriculum in most schools, both in Spain and abroad.
To conclude this brief historical review, it can be said that programming techniques together with the finite element method have produced consolidated and powerful computational tools that today make it possible to perform very sophisticated analyses and to optimize structural designs. As an example, in 2016 a team of engineers from the University of Sydney carried out a finite element study of the Eiffel Tower, concluding that if this method of analysis had been available, the tower could have been built with 46% less steel, i.e. practically half. These techniques thus lead us to optimized, more sustainable and environmentally friendly solutions.
As examples of this analysis capability, it is worth highlighting the Akashi Kaikyo suspension bridge, built in 1998 in Japan and designed by the civil and structural engineer Satoshi Kashima. This bridge currently holds the world record for span between towers (1,991 meters) and, as it stands in a seismic zone, has dampers tuned to the bridge's own frequency in the event of an earthquake. Another example is the Burj Khalifa, built in Dubai in 2009 with a record height of 828 meters, whose steel and concrete structure was designed by the civil and structural engineer Bill Baker.
Structural analysis and design is nowadays a consolidated field that requires intense and specific training not only in mathematics, physics, strength of materials and structural analysis, but also in numerical and computational methods such as finite element analysis. All this must be accompanied by a professional internship that gives the designer sufficient experience. It is significant that in the Anglo-Saxon and northern European professional world, obtaining the structural licence (i.e. the right to sign structural projects) requires: first, specialized training in structures (usually at master's level); second, a professional internship (a minimum of 3 years in the case of the United States); and third, a subsequent state examination.
Where do the challenges of structures lie today, and where are education, research and innovation heading in this area of knowledge? As I have already mentioned, this area is considered consolidated, and I believe its future will depend on progress in materials and on the new techniques brought by artificial intelligence, machine learning and big data.
The materials commonly used in structures (steel, concrete, wood and aluminum) will continue their course until some new material of massive use is found that brings economic advantages and can replace them. Although new materials such as new alloys, composites, ceramics, self-repairing concretes, etc., are being developed and used in some specific applications, for the time being no substitutes for the above materials are in sight.
Artificial intelligence and machine learning techniques are already exerting an important influence on the analysis and design of structures. The finite element models used for structural problems can be costly, both in the labor time required to develop them and in their computational cost. Machine learning techniques allow the development of very accurate, inexpensive and easy-to-use metamodels, and deep neural networks (deep learning) provide excellent results. An example is the chess program LCZero, developed using deep neural networks, which is the current world champion among computer programs, in a line of progress that began when computers started to beat humans with the famous Deep Blue match against Kasparov in 1997.
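The metamodel idea can be sketched minimally: sample an expensive structural model at a few points, fit a cheap surrogate, and query the surrogate instead. Here the "expensive model" is stood in, purely for illustration, by the closed-form tip deflection of a cantilever beam, delta = P*L**3/(3*E*I), and a least-squares fit stands in for the machine learning step; a real application would use a neural network and genuinely costly finite element runs. All numerical values are assumed for the example.

```python
# Surrogate ("metamodel") sketch.  The "expensive model" is replaced,
# purely for illustration, by the analytic tip deflection of a
# cantilever beam: delta = P * L**3 / (3 * E * I).

P, E, I = 10.0, 210e9, 1e-6   # illustrative load and section values

def expensive_model(L):
    """Pretend this is a costly finite element run."""
    return P * L**3 / (3.0 * E * I)

# 1) Sample the expensive model at a few training points.
train_L = [1.0, 2.0, 3.0, 4.0]
train_d = [expensive_model(L) for L in train_L]

# 2) Fit a cheap surrogate delta ~ k * L**3 by least squares.
k = sum(L**3 * d for L, d in zip(train_L, train_d)) / \
    sum(L**6 for L in train_L)

def surrogate(L):
    return k * L**3

# 3) Query the surrogate where the expensive model was never run.
L_new = 2.5
print(surrogate(L_new), expensive_model(L_new))  # agree closely
```

Once fitted, the surrogate answers in microseconds what the full model would take minutes or hours to compute, which is what makes metamodels attractive for design iteration and optimization.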
However, artificial intelligence models are opaque, non-intuitive and difficult to understand (the black-box concept). A new paradigm is emerging, Explainable Artificial Intelligence, which consists of using artificial intelligence techniques based on already developed scientific methods. In this way, intelligent models are obtained in which it is understood why and how they work. Under this concept, other paradigms come into play, such as Data-Driven Computational Mechanics and Physics-Informed Neural Networks, in which neural networks are complemented by structural models that establish relationships with the variables of the different layers of the neural network. In this way, everything known so far about structures can be incorporated into machine learning techniques in a complementary and intelligent way.
These developments must have an impact on the training and education of structural designers, who will need a deep and multidisciplinary foundation that includes the theory of structures, the behavior of materials, computational methods, machine learning techniques and big data processing. As we have seen, the area of structures was a pioneer in the application of computational techniques (finite elements are proof of it); it now faces the challenge of continuing this trajectory in the environment of artificial intelligence and automated design. It is time for academia to generate a technical and scientific curriculum solid enough to meet these challenges.
I would like to take this opportunity to thank the University of Navarra and its School of Architecture for the opportunity they have given me during all these years to research and teach in this very attractive area of knowledge.
Thank you very much.