neoGeorge's Build: One Reef & One Planted Freshwater

neoGeorge (OP)
Constructing crates for the existing Marineland 60g and JBJ Nano Cube 28g. Although I don't have any plans to reuse these tanks, they will have to be moved to AZ, since the fish and corals are coming out just before we're ready to hit the road.

AquariumCrate-3599.jpg


AquariumCrate-3602.jpg
 
neoGeorge (OP)
Accidents Don’t Just Happen—They Have to Happen
Our very attempts to stave off disaster make unpredictable outcomes more likely.

Read in The Atlantic: https://apple.news/AxDIeJCtxTKKBNhPSWC_K6w

Something to keep in mind as we work to prevent disasters in our reef tanks. The following is quoted from The Atlantic article:


"Accidents are part of life. So are catastrophes. Two of Boeing’s new 737 Max 8 jetliners, arguably the most modern of modern aircraft, crashed in the space of less than five months. A cathedral whose construction started in the 12th century burned before our eyes, despite explicit fire safety procedures and the presence of an on-site firefighter and a security agent. If Notre Dame stood for so many centuries, why did safeguards unavailable to prior generations fail? How did modernizing the venerable Boeing 737 result in two horrific crashes, even as, on average, air travel is safer than ever before?

These are questions for investigators and committees. They are also fodder for accident theorists. Take Charles Perrow, a sociologist who published an account of accidents occurring in human-machine systems in 1984. Now something of a cult classic, Normal Accidents made a case for the obvious: Accidents happen. What he meant is that they must happen. Worse, according to Perrow, there’s a humbling cautionary tale lurking in complicated systems: Our very attempts to stave off disaster by introducing safety systems ultimately increases the overall complexity of the systems, ensuring that some unpredictable outcome will rear its ugly head no matter what. Complicated human-machine systems might surprise us with outcomes more favorable than we had any reason to expect. They also might shock us with catastrophe.

When disaster strikes, past experience has conditioned the public to assume that hardware upgrades or software patches will solve the underlying problem. This indomitable faith in technology is hard to challenge—what else solves complicated problems? But sometimes our attempts to banish accidents make things worse.

...
What makes the Boeing disaster so frustrating is the relative obviousness of the problem in retrospect. Psychologists and economists have a term for this; it’s called the “hindsight bias,” the tendency to see causes of prior events as obvious and predictable, even when the world has no clue leading up to them. Without the benefit of hindsight, the complex causal sequences leading to catastrophe are sometimes impossible to foresee. But in light of recent tragedy, theorists like Perrow would have us try harder anyway. Trade-offs in engineering decisions necessitate an eternal vigilance against the unforeseen. If some accidents are a tangle of unpredictability, we’d better spend more time thinking through our designs and decisions—and factoring in the risks that arise from complexity itself.

...
The increasing complexity of modern human-machine systems means that, depressingly, unforeseen failures are typically large-scale and catastrophic. The collapse of the real estate market in 2008 could not have happened without derivatives designed not to amplify financial risk, but to help traders control it. Boeing would never have put the 737 Max’s engines where it did, but for the possibility of anti-stall software making the design “safe.”

In response to these risks, we play the averages. Overall, air travel is safer today than in, say, the 1980s. Centuries old cathedrals don’t burn, on average, and planes don't crash. Stock markets don’t, either. On average, things usually work. But our recent sadness forces a reminder that future catastrophes require more attention to the bizarre and (paradoxically) to the unforeseen. Our thinking about accidents and tragedies has to evolve, like the systems we design. Perhaps we are capable of outsmarting complexity more often. Sometimes, though, our recognition of what we’ve done will still come too late."
 

crusso1993
Kind of a foreboding message but one that tells truth. I like it. I'm not a big believer in sugar-coating or hiding the truth because it may be considered dark or scare someone or hurt their feelings. As a matter of fact, I believe it is our duty to share such messages and help people to determine what they will do with them, where possible. Thanks for sharing!
 
neoGeorge (OP)
You're welcome, @crusso1993! Isn't there a hashtag #truthinreefing? (I may have that wrong; I remember seeing a tag like that from @NY_Caveman.)
 

count krunk
I recently read a book by Julian Spilsbury, Great Military Disasters.

If you are into history, I recommend reading it. You can see the thinking that led to many of these disasters, and it gives some insight into generals and what commanding an army involves.

It shows that playing the averages and trusting in what is not there is not just modern thinking; it is human nature to do these things.

The book starts with Mount Tabor in 1125 BC and ends with Dien Bien Phu in 1954.
 

crusso1993
Critters are in the coolers. How they will weather the next 3 days and 2000 miles is an unknown; moving the aquariums on top of moving a whole household pegs the exhaustion meter for sure.

NanoMove-2-2.jpg
NanoMove-2.jpg

Good luck!
 
neoGeorge (OP)
16322140-3987-4A51-8247-9812A4C2FE1B.jpeg
Also did some scaping in the reef aquarium. There was some unintentional fragging of my potato chip coral and trumpet coral, so I have distributed those frags around the tank. Some sad news: my tri-colored wrasse got her tail caught in the wavemaker and is not looking so good. The six-line wrasse had been harassing her, and I think that was part of what caused the issue. I really hope she makes it; she's in the isolation box now to recover.
 
