Showing posts with label design theory.

20 July 2011

Footbridge 2011: papers roundup pt 3

Time to cover two more papers from Footbridge 2011.

There were a number of papers describing bridges which were, at best, odd, and at worst, downright awful. I won't embarrass the authors here, but the most striking examples were all cases where the architect had been let loose on their own and the engineer left to pick up the pieces later*. It was quite a shame to see such highly talented engineers being employed in this way. (*There were also a number of engineer-designed bridges which could have been improved no end by the presence of a sensitive architect!)

One case which left me with more ambiguous feelings was Tim Black's presentation Optimisation in footbridge design. This is a subject I wrote about recently, so I was keen to see the talk. Black is a director of BKK Architects, who had collaborated with RMIT University's Innovative Structures Group to attempt a new approach to tubular footbridges. They took the basic cylindrical form and applied "BESO" (bi-directional evolutionary structural optimisation), a form of topological optimisation, to it, allowing the process to eliminate and rebuild areas of material in response to analytical criteria such as stress and stiffness.
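For anyone curious what a BESO loop actually does, the core idea fits in a few lines. Below is a toy Python sketch, entirely my own illustration: a real implementation recomputes element sensitivities from a finite element analysis at every iteration, which I've replaced here with a fixed stand-in field.

```python
def beso_step(density, sensitivity, target_fraction, evo_rate=0.02):
    """One BESO-style iteration over a flat list of 0/1 element densities."""
    n = len(density)
    current_fraction = sum(density) / n
    # Move the volume fraction gradually toward the target
    next_fraction = max(target_fraction, current_fraction * (1 - evo_rate))
    n_keep = max(1, int(next_fraction * n))
    # Rank elements by sensitivity; keep the n_keep most effective ones,
    # re-adding previously void elements if they rank highly enough
    ranked = sorted(range(n), key=lambda i: sensitivity[i], reverse=True)
    new_density = [0] * n
    for i in ranked[:n_keep]:
        new_density[i] = 1
    return new_density

# A fixed stand-in sensitivity field (a real BESO recomputes this by FEA)
sens = [0.1, 0.9, 0.8, 0.05, 0.7, 0.2, 0.95, 0.15]
rho = [1] * len(sens)
for _ in range(50):
    rho = beso_step(rho, sens, target_fraction=0.5)
print(rho)  # material survives only at the highest-sensitivity elements
```

Material gradually migrates to wherever the sensitivity measure says it works hardest, which is exactly why a neatly tileable, repetitive outcome is such a surprise.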

The resulting design is illustrated above. It isn't what the architects expected (they anticipated a more regular perforated tube), nor is it what an engineer would expect, as it lacks the symmetry you would expect on a simple symmetrical design problem (a simply supported beam). This is because the architects have steered the design process to suit their preconceptions: they have extracted a segment of geometry from the solution which can be repeatedly tiled both around and along the tube, imagining that the correct approach for ease of real-world fabrication is to maximise repetition. Indeed, they have moved on to digital fabrication and precast prototyping.

No engineer would expect an optimised geometry to be tileable; instead, it would be reasonable to expect material to "collect" in the upper and lower walls of the tube according to the bending moment diagram, and to form diagonal elements in the side walls according to the shear force diagram. I like the idea that new structural forms can emerge by "growing" rather than designing a structure, but the demands of construction (ease of fabrication) and the demands of material efficiency (curved and complex elements) are opposed, and it is not easy to imagine how they will be reconciled. A more rigorous and engineer-led approach to optimisation may yet lead to interesting designs, however, and it's good to see architects thinking in this way as well.

Another paper which echoed a subject I've covered here was Markus Hennecke's Pre-stressed granite bridges: a new generation of granite bridges. This was a showcase for Kusser Aicha Granitwerke's bridges, which achieve exceptionally high span-to-depth ratios, as much as 50:1. Their bridge at Stevenage (pictured above), installed late last year, manages 49:1. The presentation attracted a keen engineering interest, with many questioners clearly looking to be persuaded on subjects such as local bursting stresses and cable protection. Indeed, bursting stresses may represent a key constraint on the range of designs achievable, as they are resisted solely by the tensile strength of the granite. I wonder whether that couldn't be extended by some form of localised strapping system, however.

Reading back through the conference proceedings, there are several other bridges which would merit attention here, and some very interesting design concepts to store away in my "for future use" folder. However, I want to move on. I'll put together a couple more posts on some of the most interesting bridges shown at the conference, and I also want to cover some of the bridges in Wrocław itself.

24 May 2011

The space of all possible bridge shapes: Part 3

In the last two posts, I introduced Stephen Wolfram's idea that it may be possible to use modern computational algorithms to develop entirely new structural forms for bridges, and discussed how difficult it might be to work out whether any of them would actually be better than what we design already.

A more coherent idea of the problem can be seen in the example truss forms that Wolfram generated to illustrate his article.

Would any of these offer any improvement on a more conventional truss design? Perhaps there is a greater robustness, but it is unlikely the benefit-to-cost ratio is anywhere close to what can be achieved by simply using a conventional truss form with stronger individual members.

It's a classic case where a specialist with little or no familiarity with another field (in this case, bridge engineering) thinks they can bring some special insight which others are blind to. It's something seen frequently in evolutionary biology, where criticism of neo-Darwinian theory generally comes from biochemists or (sadly) engineers with an essentially shallow understanding of the topic.

However, I wouldn't dismiss Wolfram entirely. There's little doubt that bridge engineers are highly constrained by habit - their own design experiences, and the traditional forms which have calcified into the standard processes of bridge fabricators. Often, new bridge concepts are dead-ended by the inability of steel and precast suppliers to invest in new equipment and technology. Even where new technology is brought in, as with the robotic welding more commonly seen in Japan than in North America or Europe, it is applied only to a very specific problem (e.g. welding of orthotropically stiffened steel plate), rather than to more radically expanding the range of what can be built economically.

As ever, footbridges offer an area where designers can experiment with smaller economic consequence, and are often encouraged to do so by promoters' ambitions. I'm quite confident there are ideas out there in academia, of which Wolfram's is just one, which are underused (or never used) by designers even in this most adventurous of bridge-building fields.

One field which is just about making it into the "real world" is that of topology optimisation. This has been used to show that the theoretically ideal catenary cable is not a single cable but a multi-stranded Hencky net,
and there are various other examples online relating directly to bridge engineering. One paper documents its application to the design of the Knokke Footbridge (pictured, right), which has been featured here previously. This shows the use of computer processing to progressively optimise steel plate thickness (or determine where it can be omitted), something that could have wider applications not just in optimising existing forms but also in generating new ones.

Was Wolfram right that we will see entirely new bridge forms which surprise us in their novelty and apparent randomness? I doubt it, but I do think there's plenty of scope to use some of the methods discussed here to take a fresh look at bridge designs now and in the future.

23 May 2011

The space of all possible bridge shapes: Part 2

In the previous post I discussed Stephen Wolfram's proposition that there exists a space of all possible bridge designs, and that if we could use modern computer techniques to generate this using a set of simple rules, we could find new and unpredictable bridge forms within that space which may improve on traditional ideas.

A key challenge is how to find the better designs, a process which involves testing each option against whichever criteria are required. This can be computationally intensive, but in the age of cloud computing, becomes a little more feasible. Sensible engineers might reasonably object that to analyse a non-trivial array of structural models would defeat even the greatest computing resources currently available, and I would sympathise.

I wonder, however, whether this isn't primarily a flaw of traditional analytical technologies such as the finite-element stiffness matrix method.

I recall a project from some years ago (VISABO) which used Newtonian mechanics in a manner more closely related to Wolfram's cellular automata, exploiting "intelligent" structural elements each of which contained their own rules of physics, global behaviour emerging naturally from their relationships. This has the potential to allow the change of structural response resulting from a change in structural form to be analysed much more quickly: individual members react dynamically when another member is moved, added, or eliminated.
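To show how little machinery that takes, here's a minimal particle-spring relaxation in Python. This is my own sketch, not VISABO's or anyone else's actual method: each member applies its own spring force to its end nodes, and global equilibrium emerges from repeated local updates with damping.

```python
def relax(nodes, fixed, springs, gravity=-9.81, dt=0.01, damping=0.95,
          steps=5000):
    """nodes: list of [x, y]; springs: (i, j, rest_length, stiffness)."""
    vel = [[0.0, 0.0] for _ in nodes]
    for _ in range(steps):
        force = [[0.0, gravity] for _ in nodes]  # unit nodal mass assumed
        for i, j, rest, k in springs:
            dx = nodes[j][0] - nodes[i][0]
            dy = nodes[j][1] - nodes[i][1]
            length = (dx * dx + dy * dy) ** 0.5
            f = k * (length - rest) / length  # tension if stretched
            force[i][0] += f * dx; force[i][1] += f * dy
            force[j][0] -= f * dx; force[j][1] -= f * dy
        for n in range(len(nodes)):
            if n in fixed:
                continue
            for c in (0, 1):  # damped explicit integration
                vel[n][c] = (vel[n][c] + force[n][c] * dt) * damping
                nodes[n][c] += vel[n][c] * dt
    return nodes

# A single free node hung between two supports by two identical springs
nodes = [[0.0, 0.0], [2.0, 0.0], [1.0, 0.0]]   # node 2 is free
springs = [(0, 2, 1.0, 100.0), (1, 2, 1.0, 100.0)]
relax(nodes, fixed={0, 1}, springs=springs)
print(nodes[2])  # the free node sags below the supports at mid-span
```

Add or delete a spring and simply keep iterating: the structure finds its new equilibrium without assembling and re-solving a global stiffness matrix.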

Another, perhaps more accessible, example is the series of Bridge Builder games (pictured above right), which appear to use the same principle (they certainly don't use finite element analysis!).

Closer to the professional arena, there is Daniel Piker's Kangaroo (pictured left), an add-on for Grasshopper / Rhino which carries out a similar physics-based simulation, and is being explicitly promoted for structural modelling purposes e.g. form-finding of catenary structures.

Nonetheless, I think that non-trivial analysis may still remain computationally too expensive, particularly for structures governed by continuum behaviour rather than by the discrete elements of beams and frames, or where non-linear, dynamic or global buckling behaviour determines performance.

Analysis of the individual designs is only half the problem: it's also necessary to test them against pre-defined criteria to decide which are optimal (or, at least, superior to neighbouring designs). Researchers like Wolfram seem to believe that "economy" is readily measurable e.g. by least material. However, real-life economy in bridges is intimately linked to simplicity of construction, and structures which are regular and repetitious are generally cheaper to manufacture and assemble than those which are highly variable. A classic example is the simple rolled steel beam, which contains considerable quantities of material resisting very low stress, yet is almost always cheaper to supply than a latticework or variable section plate girder beam where the stresses have been made more uniform.

For trusses of the sort that Wolfram takes as his example, it is likely that his process will find an optimum two-dimensional truss with irregular bay sizes or truss angles, reflecting the variation of shear, and with curved top and bottom chords, reflecting the variation of bending moment. But in three dimensions, truss members do more than carry shear and bending: they also resist out-of-plane buckling, and regular bays can make the deck design more economic. Curved chord members can similarly increase fabrication costs by more than their more uniform stress saves in material. How then can economy be easily assessed?
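The point can be made with an entirely invented cost model (the rates and member lengths below are made up purely to illustrate the trade-off): charge for steel by length, but also charge a setup cost for every distinct member size that has to be fabricated.

```python
def fabrication_cost(member_lengths, steel_rate=2.0, setup_cost=50.0):
    """Material cost plus a setup cost for each *distinct* member length."""
    material = sum(member_lengths) * steel_rate
    unique_parts = len(set(round(l, 3) for l in member_lengths))
    return material + unique_parts * setup_cost

# "Optimised" truss: slightly less steel, but every member is different
optimised = [3.0, 3.1, 2.9, 3.2, 2.8, 3.05, 2.95, 3.15]
# Conventional truss: more steel, but one repeated member size
conventional = [3.2] * 8

print(fabrication_cost(optimised), fabrication_cost(conventional))
```

The "optimised" design uses less steel yet costs far more, because every member is unique: the rolled-beam paradox in miniature. Any automated search for economy has to capture that effect, and the weightings are anything but obvious.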

If economy is difficult, what of robustness? How can that be readily measured in a manner which is quickly repeatable across a large array of possible designs?

To be continued ...

22 May 2011

The space of all possible bridge shapes: Part 1

I guess this is an old one now, dating back to 2007, but I hadn't seen it before.

Shortly after the collapse of the I-35W Mississippi River Bridge in August 2007 (pictured right, courtesy of pmarkham), Stephen Wolfram published a blog post titled "The space of all possible bridge shapes", wondering whether new developments in science could have anything to offer to bridge designers and hence help prevent future disasters. In order to come up with designs which maximise robustness while minimising cost, Wolfram speculates that designers will need to find entirely new structural forms, which may look nothing like those that have emerged from engineering history.

Wolfram is the developer of the popular Mathematica software, and a researcher into computational systems such as cellular automata. The best known of such systems is perhaps John Conway's Game of Life (pictured left, courtesy of kieff at Wikipedia), which demonstrates in a very graphic way how a wide spectrum of behaviour both random and structured can emerge from applying simple rules to the on/off state of image pixels. Genetic algorithms can be used to mutate the results, compare them against various tests, and "evolve" the system over many generations in search of some desired optimum. I have seen some experimental use of genetic algorithm techniques by architects in building design, but in structural engineering it seems to be largely confined to the academics (one of many examples here).
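The Game of Life rule itself is almost trivially simple, which is rather the point. A minimal Python step function, working on a set of live cell coordinates, is all it takes:

```python
from itertools import product

def life_step(live):
    """Apply Conway's rules to a set of (x, y) live cells."""
    neighbour_counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                neighbour_counts[cell] = neighbour_counts.get(cell, 0) + 1
    # Birth with exactly 3 neighbours; survival with 2 or 3
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row oscillate with period 2
blinker = {(0, 1), (1, 1), (2, 1)}
print(life_step(blinker))                        # flips to a vertical line
print(life_step(life_step(blinker)) == blinker)  # True
```

Everything from stable blocks to gliders to self-replicating patterns emerges from those two lines of rules, which is precisely the behaviour Wolfram wants to harness.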

Wolfram's main interest is in the ability of very simple systems to be processed and combined by simple rules to create highly complex outcomes. The range of possible outcomes forms a kind of computational landscape, which can be investigated to determine whether there are useful results other than those that might have initially been predicted. Some of this is explored in Wolfram's book, A New Kind of Science.

Wolfram notes that before the 19th century, there were only a limited number of bridge forms in use (the beam, the arch etc), but with the advent of the railway age, a Cambrian explosion in bridge shapes occurred, all variations on the metal truss. As in the evolution of organisms, a certain feature had to arise before a wide array of new forms could build upon the opportunities it presented (the evolution of evolvability). This image of truss variations is taken from Wolfram's blog post:

Most of the famous truss types (Warren, Pratt, Howe, Fink etc) arose through a process not dissimilar to natural selection: inventors of truss forms were competing in terms of strength, ease of construction, and economy, and simple economics meant that only the fittest survived. It would be interesting to trace the history of the metal truss bridge through some kind of developmental tree, complete with extinctions, hybridisation etc.

As an aside, the generation of truss forms using simple rules was the subject of an interesting paper by Yoshiaki Kubota at IABSE's Venice symposium, which I discussed here before. It forms a subset of the wider systematisation of bridge types, as illustrated in one of Kubota's diagrams below:

Wolfram's proposition is that a wide range of otherwise unpredictable variations in form can be readily generated by combinations of simple rules, e.g. add a brace, subtract a brace, subdivide a bay, shorten, lengthen, rotate. It is therefore straightforward to generate a multi-dimensional "design space" containing a myriad of options which a rational designer would never consider. The question is then whether any better designs exist within the space of possible bridge shapes, and Wolfram's experience in other areas makes him believe quite strongly that they would. His other work also suggests they may look like nothing we have seen before, possibly quite "random" in appearance.
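As a sketch of what "combinations of simple rules" means in practice, here is a toy Python generator (the representation and rules are my own invention, far cruder than anything Wolfram or Kubota describe): a truss is reduced to a bay count and a tuple of brace directions, and each rule yields a neighbouring design.

```python
def variants(truss):
    """Yield every design one rule-application away from `truss`."""
    bays, braces = truss
    # Rule 1: subdivide -- add a bay with a default brace
    yield (bays + 1, braces + ("/",))
    # Rule 2: remove the last bay (if more than one remains)
    if bays > 1:
        yield (bays - 1, braces[:-1])
    # Rule 3: flip each brace direction in turn
    for i in range(bays):
        flipped = braces[:i] + ("\\" if braces[i] == "/" else "/",) \
                  + braces[i + 1:]
        yield (bays, flipped)

seed = (2, ("/", "/"))
space = set(variants(seed))
print(len(space))  # 4 distinct one-step neighbours of the seed design
```

Applying the rules recursively from a seed enumerates an ever-growing design space, most of which no rational designer would ever draw; the hard part, as discussed in the following posts, is deciding which of those designs are any good.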

This post is getting quite long, so I'll continue this tomorrow.