
Most complete Tasmanian tiger genome yet pieced together from 110-year-old pickled head


Scientists have assembled the most complete Tasmanian tiger genome to date from a century-old pickled head, providing a full DNA blueprint to potentially bring the extinct species back to life.

The breakthrough — one of several new advances in Tasmanian tiger de-extinction efforts spearheaded by the company Colossal Biosciences — was made possible thanks to a 110-year-old head that was skinned and preserved in ethanol. The exceptional preservation of this specimen enabled researchers to piece together most of its DNA sequence, as well as strands of RNA (a molecule that is structurally similar to DNA but has only one strand) that show which genes were active in various tissues when the animal died.

34.6 High-temperature Superconductors – College Physics


Summary

  • Identify superconductors and their uses.
  • Discuss the need for a high-Tc superconductor.

Superconductors are materials with a resistivity of zero. They are familiar to the general public because of their practical applications and have been mentioned at a number of points in the text. Because the resistance of a piece of superconductor is zero, there are no heat losses for currents through them; they are used in magnets needing high currents, such as in MRI machines, and could cut energy losses in power transmission. But most superconductors must be cooled to temperatures only a few kelvin above absolute zero, a costly procedure limiting their practical applications. In the past decade, tremendous advances have been made in producing materials that become superconductors at relatively high temperatures. There is hope that room temperature superconductors may someday be manufactured.

Superconductivity was discovered accidentally in 1911 by the Dutch physicist H. Kamerlingh Onnes (1853–1926) when he used liquid helium to cool mercury. Onnes had been the first person to liquefy helium a few years earlier and was surprised to observe the resistivity of a mediocre conductor like mercury drop to zero at a temperature of 4.2 K. We define the temperature at which and below which a material becomes a superconductor to be its critical temperature, denoted by [latex]{T_c}[/latex]. (See Figure 1.) Progress in understanding how and why a material becomes a superconductor was relatively slow, with the first workable theory coming in 1957. Certain other elements were also found to become superconductors, but all had [latex]{T_c}[/latex]s of less than 10 K, temperatures that are expensive to maintain. Although Onnes received a Nobel prize in 1913, it was primarily for his work with liquid helium.

In 1986, a breakthrough was announced—a ceramic compound was found to have an unprecedented [latex]{T_c}[/latex] of 35 K. It looked as if much higher critical temperatures could be possible, and by early 1988 another ceramic (this of thallium, calcium, barium, copper, and oxygen) had been found to have [latex]{T_c = 125 \;\text{K}}[/latex] (see Figure 2.) The economic potential of perfect conductors saving electric energy is immense for [latex]{T_c}[/latex] s above 77 K, since that is the temperature of liquid nitrogen. Although liquid helium has a boiling point of 4 K and can be used to make materials superconducting, it costs about $5 per liter. Liquid nitrogen boils at 77 K, but only costs about $0.30 per liter. There was general euphoria at the discovery of these complex ceramic superconductors, but this soon subsided with the sobering difficulty of forming them into usable wires. The first commercial use of a high temperature superconductor is in an electronic filter for cellular phones. High-temperature superconductors are used in experimental apparatus, and they are actively being researched, particularly in thin film applications.

Figure 1. A graph of resistivity versus temperature for a superconductor shows a sharp transition to zero at the critical temperature Tc. High temperature superconductors have verifiable Tc s greater than 125 K, well above the easily achieved 77-K temperature of liquid nitrogen.
Figure 2. One characteristic of a superconductor is that it excludes magnetic flux and, thus, repels other magnets. The small magnet levitated above a high-temperature superconductor, which is cooled by liquid nitrogen, gives evidence that the material is superconducting. When the material warms and becomes conducting, magnetic flux can penetrate it, and the magnet will rest upon it. (credit: Saperaud)

The search is on for even higher [latex]{T_c}[/latex] superconductors, many of complex and exotic copper oxide ceramics, sometimes including strontium, mercury, or yttrium as well as barium, calcium, and other elements. Room temperature (about 293 K) would be ideal, but any temperature close to room temperature is relatively cheap to produce and maintain. There are persistent reports of [latex]{T_c}[/latex] s over 200 K and some in the vicinity of 270 K. Unfortunately, these observations are not routinely reproducible, with samples losing their superconducting nature once heated and recooled (cycled) a few times (see Figure 3.) They are now called USOs or unidentified superconducting objects, out of frustration and the refusal of some samples to show high [latex]{T_c}[/latex] even though produced in the same manner as others. Reproducibility is crucial to discovery, and researchers are justifiably reluctant to claim the breakthrough they all seek. Time will tell whether USOs are real or an experimental quirk.

The theory of ordinary superconductors is difficult, involving quantum effects for widely separated electrons traveling through a material. Electrons couple in a manner that allows them to get through the material without losing energy to it, making it a superconductor. High- [latex]{T_c}[/latex] superconductors are more difficult to understand theoretically, but theorists seem to be closing in on a workable theory. The difficulty of understanding how electrons can sneak through materials without losing energy in collisions is even greater at higher temperatures, where vibrating atoms should get in the way. Discoverers of high [latex]{T_c}[/latex] may feel something analogous to what a politician once said upon an unexpected election victory—“I wonder what we did right?”

Figure 3. (a) This graph, adapted from an article in Physics Today, shows the behavior of a single sample of a high-temperature superconductor in three different trials. In one case the sample exhibited a Tc of about 230 K, whereas in the others it did not become superconducting at all. The lack of reproducibility is typical of forefront experiments and prohibits definitive conclusions. (b) This colorful diagram shows the complex but systematic nature of the lattice structure of a high-temperature superconducting ceramic. (credit: en:Cadmium, Wikimedia Commons)
  • High-temperature superconductors are materials that become superconducting at temperatures well above a few kelvin.
  • The critical temperature [latex]{T_c}[/latex] is the temperature below which a material is superconducting.
  • Some high-temperature superconductors have verified [latex]{T_c}[/latex] s above 125 K, and there are reports of [latex]{T_c}[/latex] s as high as 250 K.

Conceptual Questions

1: What is critical temperature [latex]{T_c}[/latex]? Do all materials have a critical temperature? Explain why or why not.

2: Explain how good thermal contact with liquid nitrogen can keep objects at a temperature of 77 K (liquid nitrogen’s boiling point at atmospheric pressure).

3: Not only is liquid nitrogen a cheaper coolant than liquid helium, its boiling point is higher (77 K vs. 4.2 K). How does higher temperature help lower the cost of cooling a material? Explain in terms of the rate of heat transfer being related to the temperature difference between the sample and its surroundings.

Problems & Exercises

1: A section of superconducting wire carries a current of 100 A and requires 1.00 L of liquid nitrogen per hour to keep it below its critical temperature. For it to be economically advantageous to use a superconducting wire, the cost of cooling the wire must be less than the cost of energy lost to heat in the wire. Assume that the cost of liquid nitrogen is $0.30 per liter, and that electric energy costs $0.10 per kW·h. What is the resistance of a normal wire that costs as much in wasted electric energy as the cost of liquid nitrogen for the superconductor?

Glossary

Superconductors: materials with a resistivity of zero
critical temperature: the temperature at which and below which a material becomes a superconductor

Solutions

Problems & Exercises

1: [latex]{0.30 \;\Omega}[/latex]
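For readers who want to see where this number comes from, the short sketch below works through the arithmetic using only the values given in the problem statement (Python is used here simply as a calculator).

```python
# Problems & Exercises 1: find the resistance R of a normal wire whose wasted
# electric energy costs as much per hour as the superconductor's liquid nitrogen.
current_A = 100.0               # current in the wire, in amperes
nitrogen_cost_per_hour = 0.30   # dollars per hour (1.00 L/h at $0.30 per liter)
energy_cost_per_kwh = 0.10      # dollars per kW·h

# Setting the hourly costs equal:  I^2 * R * (1 h) * ($0.10 / kW·h) = $0.30
allowed_loss_kw = nitrogen_cost_per_hour / energy_cost_per_kwh  # 3 kW of permitted dissipation
resistance_ohm = allowed_loss_kw * 1000 / current_A**2          # P = I^2 R  =>  R = P / I^2

print(f"R = {resistance_ohm:.2f} ohm")  # R = 0.30 ohm, matching the answer above
```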

 

NASA still unsure when Boeing’s Starliner will fly astronauts again


NASA is still unsure when it will next put astronauts on Boeing’s Starliner spacecraft, which experienced issues during its first crewed test flight this summer.

Starliner’s next “potential” crewed mission to the International Space Station (ISS) in 2025 “will be determined once a better understanding of Boeing’s path to system certification is established,” NASA officials wrote in a statement on Tuesday (Oct. 15).

Top Branches of Mechanical Engineering


Mechanical engineering is an essential engineering discipline encompassing many specializations, each contributing its own aspect to this dynamic and inventive field. As technology advances and industries grow, the branches within mechanical engineering continue to evolve, offering a wide range of career opportunities.


Today, we shall look closely at the major sectors of mechanical engineering, examining the unique roles, applications, and innovative progress each one brings to the engineering world. Covering both traditional areas like thermodynamics and biomedical engineering and newer fields such as nanotechnology and robotics, this guide will help you determine whether a Bachelor of Science in Mechanical Engineering aligns with your career aspirations.

1.   Thermodynamics

The conversion of energy and heat is a crucial aspect of mechanical engineering, and it requires a close understanding of thermodynamics. Engineers in this field are responsible for designing a wide array of systems, including HVAC units, engines, and refrigeration setups.

Their chief priority is not simply to provide effective solutions but also to champion sustainability, making optimal use of energy resources while remaining mindful of environmental implications.

2.   Nanotechnology

Nanotechnology is an emerging, future-oriented field within mechanical engineering that focuses on manipulating materials at the molecular or atomic level. Engineers in this area design materials and devices with applications ranging from electronic components to medical devices.

Because of the minute scale involved, conventional engineering techniques are often unsuitable, which makes the work challenging. Creating innovative solutions at the nanoscale therefore demands imaginative thinking as well as deep knowledge of physics and chemistry.

3.   Robotics

Robotics is a rapidly growing subdivision of mechanical engineering that is transforming how we engage with technology. Skilled engineers in this area create robots for industrial and medical roles that can operate under challenging conditions efficiently and reliably. Their objective is to design mechanically sound frameworks for these machines, ensuring successful operation without the setbacks often associated with human limitations such as inconsistency.

4.   Manufacturing Engineering

Manufacturing engineering focuses on enhancing and overseeing manufacturing processes. It encompasses an array of techniques, from traditional methods to 3D printing technology. Experts in this field are committed to maximizing efficiency and minimizing costs while ensuring top-notch product quality. They also lead efforts to discover new methodologies that help meet the demands of multiple industries.

5.   Biomedical Engineering

Biomedical engineering is where mechanical principles meet the human body. Engineers in this field work on developing medical devices, prosthetics, artificial organs, and diagnostic equipment. The complexity of the human body presents unique challenges, requiring solutions that are not only effective but also safe and compatible with biological systems. This branch demands a deep understanding of both engineering principles and biological sciences, aiming to enhance the quality of healthcare and patient outcomes.

6.   Transportation Systems

Transportation engineering involves designing and developing efficient, safe, and sustainable modes of transportation, such as cars, trains, and aircraft. Engineers in this field tackle various challenges, from aerodynamics for vehicles to the safety and efficiency of mass transit systems. They work on optimizing design for performance, fuel efficiency, and environmental impact, ensuring that transportation systems meet the growing demands of modern society.

Endnote

Each branch of mechanical engineering plays a crucial role in advancing technology and improving our daily lives. As the field continues to evolve, these specializations offer exciting opportunities for innovation and problem-solving across a wide range of industries.

Sachin Thorat

Sachin is a B.Tech graduate in Mechanical Engineering from a reputed engineering college. Currently, he works in the sheet metal industry as a designer. He is also interested in product design, animation, and project design, and he likes to write articles related to mechanical engineering and to motivate other mechanical engineering students with his innovative project ideas, designs, models, and videos.


Tuneable Light Waves for Optical Sensing



OFS AcoustiSens® Optical Fibers used in random OPO system demonstration 

Once again, OFS optical fibers are paving the way for researchers to bring cutting-edge technology out of the lab and into practical applications. This time, we’re delving into the realm of optical fiber sensing – a technology that relies on a carefully tuned light source with specific traits like wavelength, power, and pulse width. 

Optical fiber sensing generally starts with a laser, but lasers come with a catch: their materials are carefully selected to emit stable light pulses at one specific wavelength, which limits their flexibility. A system with wavelength modulation promises exciting innovations for fields as diverse as quantum computing and LiDAR sensing.

OPOs can use the deliberate scattering in AcoustiSens optical fiber to change the wavelength of light pulses

Enter the optical parametric oscillator (OPO). It transforms regular laser light into controlled-wavelength pulses by guiding the laser light into an optical cavity and bouncing it around nonlinear crystals and resonators. As the light moves through the cavity and is sent back over itself multiple times, the system changes wavelengths and creates parametric amplification.

However, there’s a hiccup in this dazzling performance: OPOs are quite sensitive to temperature and environmental changes. Even small changes impact the wavelength and power of the light as it exits the cavity, confining OPOs mostly to high-maintenance lab settings. 

Researchers theorized that a random laser, which encourages scatter in the light source, would make the system more robust because the scattering would come from the controlled design of the laser and not be at the mercy of environmental changes in the optical cavity. 

A groundbreaking paper from the University of Ottawa validates this concept. A team demonstrated, for the first time, that an augmented sensing optical fiber like OFS’ AcoustiSens can make this idea a reality. AcoustiSens is manufactured with enhanced Rayleigh scattering and this scattering allowed the OPO system to have stable, tuned wavelengths in a simple and robust optical cavity. 

Congratulations to the University of Ottawa team and to all the technologists working to unshackle OPOs from the lab. 



De-extinction company Colossal claims it has nearly complete thylacine genome


Thylacines, or Tasmanian tigers, went extinct in 1936

Colossal Biosciences

The genome of the extinct thylacine has been nearly completely sequenced, de-extinction company Colossal has announced. It says the genome is more than 99.9 per cent complete, with just 45 gaps that will soon be closed – but it has provided no evidence to back up its claim.

“It’s a fairly difficult thing to get a fully complete genome of almost any organism,” says Emilio Mármol-Sánchez at the University of Copenhagen, Denmark, whose team was the first to extract RNA from a preserved thylacine. For example, the last few holdouts of the human genome were only fully sequenced in the past few years.

Thylacines, also known as Tasmanian tigers, were carnivorous marsupials once found throughout Australia, but by the time European explorers arrived, they were limited to Tasmania. The last known thylacine died in a zoo in 1936.

The genome of a preserved thylacine was first sequenced in 2017 using tissue from a then-108-year-old thylacine pouch preserved in alcohol. However, this genome was far from complete, with many gaps. Now Colossal, which also aims to recreate the woolly mammoth, says it has largely completed this genome with the help of additional DNA from a 120-year-old tooth.

“Our genome is not as complete as the most complete human genome, but we were able to take advantage of some of the same technologies,” says Andrew Pask at the University of Melbourne in Australia, a member of Colossal’s scientific advisory board.

It is difficult to completely sequence the genomes of plants and animals because there are large sections where the same sequences are repeated many times. Standard techniques that sequence small segments of DNA at a time don’t work for these parts – it is like trying to reassemble a book from a list of the words in it.

Newer, long-read techniques can sequence much larger segments of DNA – whole pages of the book. However, old DNA usually breaks up into lots of small pieces, so these methods don’t often help.
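To make the repeat problem concrete, here is a toy illustration (the sequences are invented and this is not Colossal's actual pipeline): two genomes that differ only in the order of their unique blocks yield exactly the same collection of short fragments, so short reads alone cannot tell them apart, while a single read long enough to span a repeat and its flanking sequence can.

```python
# Toy illustration of why short DNA fragments cannot resolve repeats.
# The sequences are invented for the example; this is not Colossal's pipeline.

def reads(seq, length):
    """All overlapping fragments ('reads') of the given length, as a sorted list."""
    return sorted(seq[i:i + length] for i in range(len(seq) - length + 1))

repeat, block_x, block_y = "AAAA", "CGT", "GGC"
genome_1 = repeat + block_x + repeat + block_y + repeat   # ...X...Y...
genome_2 = repeat + block_y + repeat + block_x + repeat   # ...Y...X... (blocks swapped)

# Short reads: both genomes produce exactly the same collection of fragments,
# so no assembler could tell which order the unique blocks appear in.
print(reads(genome_1, 5) == reads(genome_2, 5))    # True

# A read long enough to span the repeat plus both flanks does distinguish them.
print(reads(genome_1, 12) == reads(genome_2, 12))  # False
```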

“Most ancient samples preserve DNA fragments that are on the order of tens of bases long – hundreds if we are lucky,” says Pask. “The sample we were able to access was so well preserved that we could recover fragments of DNA that were thousands of bases long.”

Given the lack of any other thylacine genomes to make a comparison with, there is no direct way to tell how complete it is – instead Pask says Colossal is using other related species in the same family to make this estimate.

But even if the genome is as complete as Colossal thinks and it really can fill in the remaining gaps, there is currently no feasible way to generate living cells containing this genome. Instead, Colossal plans to genetically modify a living marsupial called the fat-tailed dunnart to make it more like a thylacine.

“It’s more a recreation of some traits,” says Mármol-Sánchez. “It would not be an extinct animal, but a pretty weird, modified version of the modern animal that resembles our image of those extinct animals.”

Colossal says it has made a record 300 genetic edits to the genomes of dunnart cells growing in culture. So far, all are small changes, but Pask says the team plans to swap in tens of thousands of base pairs of thylacine DNA in the near future. It isn’t yet clear how many edits will be required to achieve the company’s goal of recreating the thylacine, he says.

When asked why Colossal had provided no evidence in support of its claims, CEO Ben Lamm said the company’s sole focus is de-extinction, not writing scientific papers. “We are not an academic lab where papers are their main focus,” said Lamm. “We will continue to make progress much faster than the process of writing scientific papers.”


Upgrades That Make Your Truck Go Further on Less Fuel


Trucks are awesome. Short of owning a tank, few vehicles make you feel like you are controlling a massive beast. Perhaps it’s in our genes to love the idea of riding something that’s far more powerful than us. However, there is one sore point that leaves truck owners feeling a little down: poor mileage.


It’s often viewed as the price that has to be paid for the power you enjoy, but is that true? Is there no way to combine power and good fuel efficiency in the same vehicle? Let’s find out. 

Start at the Source: The Engine

At the heart of every truck is a powerful engine. Poor mileage is a direct consequence of the extra fuel needed to fire each cylinder. While there is no way to magically change the laws of physics to get more mileage, there are things you can do to get your engine to be more efficient. 

The first step would be proper maintenance. It’s a common enough situation: a vehicle runs fine for the first few years but starts to perform poorly after some time. You may run into startup issues, vibrations, and yes, even a reduction in mileage. 

When the truck is taken to a shop and checked out, poor maintenance is often the most common issue noticed. This can impact mileage more than you think. For instance, if the air filters have gotten clogged up with months or years’ worth of dust and gunk, it leads to a terrible air-fuel mixture. 

The result? Incomplete combustion of fuel or wasted potential mileage. 

Similarly, the fuel injector could be clogged, and there might be overheating issues caused by a poorly maintained cooling system. All these have the potential to wreak havoc on your mileage. 

Consider New Engine Mods

Sometimes, your truck’s mileage might be poor, even if you have been maintaining it well. Does that mean you need to give up? Not at all. There are a number of aftermarket modifications that can boost not just your mileage but other performance parameters as well. 

Many truck owners choose to use superior fuel injection systems with 5th Gen Cummins parts. Similarly, performance chips and tuners, aerodynamic adjustments, and exhaust system upgrades can all work together to improve mileage. 

However, as Diesel Power Products notes, if you are going to be increasing power and performance, your truck may need adjustments in other aspects to ensure compatibility. That’s just something to keep in mind if you were about to order a bunch of aftermarket mods. 

Take the time to research your vehicle and what mods can be installed without too much hassle. The last thing you want is to purchase something and find out you need to change five other parts in your vehicle for it to work. 

Don’t Sleep on the Tire Upgrades

For some reason, people underestimate the impact that tires have. In many cases, a better set of tires will give you a far greater boost than you might get playing around with a chip tuner.

According to the U.S. Department of Energy, up to 30% of your vehicle’s fuel consumption is tire-related. This is because rolling resistance is an extremely powerful factor when we are talking about mileage. If your tires are worn out, have uneven wear, or are simply improperly inflated, it can lead to greater rolling resistance. 

This means your engine has to use more power to move. You might be surprised at how useful a new set of low-rolling-resistance (LRR) tires can be. The latest ones also use advanced rubber compounds that reduce heat generation, wear, and energy loss.
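For a feel of why rolling resistance matters so much, here is a rough back-of-the-envelope sketch; the truck mass and rolling-resistance coefficients are assumptions chosen for illustration, not measurements from any particular vehicle.

```python
# Rough illustration of how rolling resistance feeds into energy use.
# Every number here is an assumption chosen for the example, not measured truck data.

mass_kg = 2800.0        # loaded pickup truck (assumed)
g = 9.81                # gravitational acceleration, m/s^2
distance_m = 100_000.0  # a 100 km trip

def rolling_energy_mj(crr):
    """Energy spent working against rolling resistance over the trip, in megajoules."""
    force_n = crr * mass_kg * g        # F = Crr * m * g
    return force_n * distance_m / 1e6

worn_tires = rolling_energy_mj(crr=0.012)  # worn or under-inflated tires (assumed Crr)
lrr_tires = rolling_energy_mj(crr=0.008)   # low-rolling-resistance set (assumed Crr)

saving = 100 * (worn_tires - lrr_tires) / worn_tires
print(f"Rolling losses per 100 km: {worn_tires:.0f} MJ vs {lrr_tires:.0f} MJ ({saving:.0f}% less)")
# Actual fuel savings depend on speed, load, and the rest of the drive cycle,
# but a third less energy lost to the tires is a meaningful slice of the total.
```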

If you are worried about using LRR tires in rain and snow, you don’t need to be: there are even all-season LRR options you can use.

In conclusion, if you own a large truck, you probably know that the mileage isn’t going to be its best selling point. These are vehicles made with power and not economy in mind. That said, this fact doesn’t mean you don’t have the freedom to make the most of every gallon of gas. 

Considering the prices we have to deal with today, maximizing fuel efficiency is one of the more sensible things to focus on. Thankfully, there are several ways to tinker with your truck to increase performance.

It’s worth noting that a single upgrade or modification may not make much of a difference. However, when you start addressing multiple problem points, you will quickly see a nice jump in the number of miles you get per gallon.


Delete Face and Move Face in SolidWorks


Surface modelling is a powerful way of using SolidWorks to create 3D models, but it can be daunting for users who are used to solid modelling.

As a stepping stone to full surface modelling there are two ‘pseudo-surfacing’ tools which are often overlooked. These are the Move Face and the Delete Face – both with quite self-explanatory names – and they are more like standalone features that can be very useful in all sorts of different situations.

The Delete Face Tool

As the name suggests, this tool deletes faces, but it also allows you to easily patch or fill faces.
The Delete Face tool can be found on the Surfaces tab (or Insert>Face>Delete) and after opening presents a fairly simple set of options.


First, the face to be deleted is selected, but we also have three different options: Delete, Delete and Patch, or Delete and Fill.

Selecting the Delete option simply removes the chosen face and leaves the body open. For example, if we have a solid cube and delete the top face, we’re then left with an open surface body. Note that the previously solid body has turned into a surface body, because a collection of surfaces can’t be solid unless it is fully enclosed.

 


 

However, the real utility of this tool comes with the ability to patch or fill the deleted faces. For example, you may have been sent a part as shown below and been asked to remove or change the internal fillet.


 

 

Depending on how the part has been made this may be a trivial change, but if the fillet hasn’t been modelled in a certain way then adjusting or removing it may cause issues later in the model’s feature tree. You may have also been given a third-party file, like a STEP file, which doesn’t give you access to the model’s features. In these cases we can use the Delete Face tool.

By using the Delete and Patch option we can select the fillet faces and patch the remaining gap, resulting in a straight edge that can then be used to add a new, differently-sized fillet, or can be left as a straight edge.

 

 


 

When using Delete and Patch, the adjoining surfaces are extended to form a straight edge.

 


Delete and Fill is slightly different in that it replaces the multiple fillet faces with one single surface. Selecting the Tangent Fill option ensures that this new face joins the surrounding faces in a suitable way.

 


The Delete Face tool can also be useful for removing holes and openings. For example, in a normal SolidWorks model the hole feature below could easily be deleted or suppressed, but if the model is an imported third-party file then you might not be able to directly edit those features. Or perhaps a feature elsewhere in the model depends on the hole, so we don’t want to get rid of the feature itself; we just want to fill the hole.

In many cases you could fill this hole manually using features like Extruded Boss/Base and Up To Surface end conditions but actually it’s easier – especially when working with curved faces – to use Delete Face.
Simply select Delete and Patch, then select the inner face of the hole. This will delete the selected face and patch the two remaining large faces as shown below.


Some further examples of the Delete and Fill and Delete and Patch options are shown below.


It should be noted that the automatic patching or filling doesn’t always work. When working with complex geometry, patches may fail or result in undesirable, strangely shaped surfaces. In those cases it’s best to use the basic Delete option and then manually create a patch using a feature like a Boundary Surface.

The Move Face Tool

The Move Face tool is similar and can also be used in all types of modelling. It allows you to move, offset or rotate faces easily. For example, you may have a STEP file – similar to the bracket below – which has no editable SolidWorks features, so changes that would otherwise be pretty simple aren’t easy to make.

 


 

The Move Face tool isn’t on the Surfaces toolbar but can instead be found under Insert > Face > Move.
After opening the tool we’re presented with a selection box and three options: Offset, Translate (move), or Rotate.
When Offsetting, faces are moved inwards or outwards by a set amount. This can be used to make the entire part larger or smaller, or simply to move specific faces.

 


 

Offset is also useful for adjusting hole sizes. The inner faces of holes are selected and the Offset amount is set, making them larger or smaller.

 


 

The next option is Translate and this allows you to move faces in a specified direction. Select a face, then move it in the X, Y or Z direction, by adjusting the values in the property manager or by dragging the orange arrows in the graphics area.

 


 

As well as the X, Y and Z directions, specific directions can also be used by selecting edges or axes.

 


 

Finally, faces can also be rotated. The direction of rotation, and the origin point of the rotation can be adjusted in the property manager, or the rotation arrows in the graphics area can be used. When a face is selected, the rotation point will be in the centre of that face by default.

 


 

You can also choose an edge or an axis to rotate around, then set a specified angular rotation.

 


 

Design Intent

When using these two useful tools, especially Move Face, be aware of your design intent. If your part is mated in an assembly, for example, and the Move Face command is then used on one of the mated faces, the mate in your assembly will use the new position of the face, not the original position.

Any features that reference the moved faces will also be affected. For example, if a hole is dimensioned 80 mm from a face and that face is moved, the hole will remain 80 mm from the face’s new position, so its position relative to the rest of the part will have changed.

Delete Face & Move Face Top Tips

  • The Delete Face and Move Face tools are quite simple but can be extremely useful.
  • Delete and Patch extends the adjoining surfaces to give a straight join.
  • Delete and Fill replaces the previous surfaces with one single, new surface.
  • Move Face is especially useful when working with imported parts that have no feature tree.
  • These tools can be used to remove holes and openings, adjust fillets, and much more.

More about SOLIDWORKS:


About the Author: This is a guest post by Johno Ellison, a design engineer with over fifteen years of experience, who specializes in SolidWorks 3D CAD modeling. Johno is the author of the following online SolidWorks courses:
Master SolidWorks 2021 – 3D CAD using real-world examples
Master SolidWorks 2019 – 3D CAD using real-world examples
Master SolidWorks 2018 – 3D CAD using real-world examples

Check out his latest SOLIDWORKS course – SolidWorks SURFACING Fundamentals, rated 4.7 stars on Udemy!

Brookhaven’s Computing Center Reaches 300 Petabytes of Stored Data


Largest compilation of nuclear and particle physics data in the U.S., all easily accessible — with plans for much more


Part of the team that manages the tape storage library at the Scientific Data and Computing Center of the U.S. Department of Energy’s Brookhaven National Laboratory, left to right: Qiulan Huang, Tim Chou, Alexei Klimentov, Ognian Novakov, Joe Frith, Shigeki Misawa. (David Rahner/Brookhaven National Laboratory)

UPTON, N.Y. — The Scientific Data and Computing Center (SDCC) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory has reached a major milestone: It now stores more than 300 petabytes of data. That’s far more data than would be needed to store everything written by humankind since the dawn of history — or, if you prefer your media in video format, all the movies ever created.

“This is the largest tape archive in the U.S. for data from nuclear and particle physics (NPP) experiments, and third in terms of scientific data overall*,” said Brookhaven Lab physicist Alexei Klimentov, who manages SDCC.

“Our 300 petabytes — or 300 million gigabytes — would equal six or seven million movies,” said Brookhaven Lab engineer and data specialist Tim Chou. “Since the first movie was made in 1888, humans have generated some 500,000 movies. So, all the feature films ever created would fill only a small percentage of our storage.”

Written history, starting from Sanskrit to today, would fill just 50 petabytes. “We have six times more data,” Klimentov said.

The current SDCC cache comes from experiments at the Relativistic Heavy Ion Collider (RHIC), a DOE Office of Science user facility for nuclear physics research that’s been operating at Brookhaven Lab since 2000, and the ATLAS experiment at the Large Hadron Collider (LHC), located at CERN, the European Organization for Nuclear Research. These two colliders each smash protons and/or atomic nuclei together at nearly the speed of light, thousands of times per second, to explore questions about the nature of matter and fundamental forces. Detectors collect countless characteristics of the particles that stream from each collision and the conditions under which the data were recorded.

“And amazingly, every single byte of this data is online. It’s not in bolted storage that is not available,” Chou said. “Collaborators around the world can access it, and we will mount it and send it back to them.”

Seamless access on demand

By mounting it, he means pulling the relevant information out of a state-of-the-art, high-tech tape storage library. When RHIC and ATLAS collaborators want a particular dataset — or multiple sets simultaneously — an SDCC robot grabs the appropriate tape(s) and mounts the desired data to disk within seconds. Scientists can tap into that data as if it were on their own desktop, even from halfway around the world.

“We have data available on demand,” Klimentov said. “It’s stored on tape and then staged on disk for physicists to access when they need it, and this is done automatically. It’s really a ‘carousel of data,’ depending on what RHIC or ATLAS physicists want to analyze.”
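One way to picture this "carousel of data" is as a small cache sitting in front of the tape library. The sketch below is purely conceptual (it is not HPSS or SDCC software, and the dataset names are invented), but it captures the basic pattern: recently requested datasets stay staged on disk, and the oldest is evicted when room is needed for the next tape recall.

```python
# Conceptual sketch of the tape-to-disk staging pattern described above.
# This is an illustration only, not HPSS or SDCC code; dataset names are made up.

from collections import OrderedDict

class StagingCache:
    """Keep recently requested datasets staged on disk; evict the oldest
    entry when disk space runs out so the next tape recall has room."""

    def __init__(self, disk_slots=3):
        self.disk = OrderedDict()     # dataset name -> staged contents
        self.disk_slots = disk_slots

    def recall_from_tape(self, name):
        # Stand-in for the robot mounting a tape and copying the dataset to disk.
        return f"<contents of {name}>"

    def read(self, name):
        if name in self.disk:                  # already staged: serve straight from disk
            self.disk.move_to_end(name)
        else:                                  # not staged: recall from tape, evicting if full
            if len(self.disk) >= self.disk_slots:
                self.disk.popitem(last=False)  # drop the least recently used dataset
            self.disk[name] = self.recall_from_tape(name)
        return self.disk[name]

cache = StagingCache()
for dataset in ["run23_gold_gold", "atlas18_muon_skim", "run23_gold_gold"]:
    cache.read(dataset)   # the repeat request for run23_gold_gold is served from disk
```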

SDCC engineer Yingzi (Iris) Wu noted that the system requires very good monitoring, some redundancy of data and control paths, and good support of the equipment.

“We have developed our own software and website to generate plots that let us know what is going on with the data transfers,” she said. “And we added a lot of capabilities for monitoring how the data goes into and out of the High Performance Storage System (HPSS).”

HPSS is a data-management system designed by a consortium of DOE labs and IBM to ensure that components of complex data storage systems — tapes, databases, disks, and other technologies — can “talk” to one another. The consortium developed the software physicists use to access SDCC’s data.

“We install and configure the software and let users use it,” Wu said, “but we always need to improve the way to talk to those systems to get data in and out. Our log system has alerts. If anything is not performing well, it will send out alerts to the team,” she said.

Artificial intelligence (AI) and machine learning (ML) algorithms can help detect such anomalies and reduce the operational burden on the computing professionals and engineers who provide support for users of HPSS and SDCC’s storage systems, Klimentov noted. “This is something SDCC staff plan to develop more over the next few years to meet future data demands,” he said.
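SDCC has not published details of its planned AI/ML tooling, so the sketch below is only a generic illustration of the kind of statistical baseline such monitoring might start from: flag any transfer-rate sample that falls far outside the recent norm. The rate values are hypothetical.

```python
# Generic illustration of flagging anomalous transfer rates in monitoring logs.
# This is not SDCC's system; it is a simple statistical baseline for the idea.

import statistics

def flag_anomalies(rates_mb_s, window=10, threshold=3.0):
    """Flag samples that sit more than `threshold` standard deviations
    away from the mean of the preceding `window` samples."""
    alerts = []
    for i in range(window, len(rates_mb_s)):
        recent = rates_mb_s[i - window:i]
        mean, stdev = statistics.mean(recent), statistics.stdev(recent)
        if stdev > 0 and abs(rates_mb_s[i] - mean) > threshold * stdev:
            alerts.append((i, rates_mb_s[i]))
    return alerts

# Hypothetical transfer-rate samples (MB/s) with one sudden drop.
samples = [820, 815, 830, 808, 825, 818, 822, 812, 827, 819, 310, 821]
print(flag_anomalies(samples))   # [(10, 310)]
```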

Energy and cost savings

Why use such a complex two-tiered tape-to-disk system? The answer is simple: cost.

“When you consider the cost per terabyte of storage, tape is four or five times less expensive than disk,” said SDCC engineer Ognian Novakov.

In addition, for data to be available on disk, the disks have to spin in computers, eating up energy and emitting heat — which further increases the energy needed to keep the computers cool. Tape, which is relatively static when not in use, has lower power demands.

“Tape storage is generally designed for deep storage, deep archives. You write the data and almost never read it, unless you need it for recovery or to meet compliance requirements,” Novakov said.

“But in our case, it’s a very dynamic archive,” he said. The robots frequently access the tape archive to move/stage requested data to disk, then the staged data gets deleted from disk so there’s space for the next request. “HPSS plus our tape libraries provide the functionality of an infinite file system,” Novakov said.

Klimentov noted that the more efficient storage has allowed SDCC to reduce the data volume on disk “by a factor of two.”

Cutting down on disks has another benefit since they have an average lifetime of just five years, compared to tape with a shelf life of about 30 years, Chou said.
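Putting the quoted figures together shows why the two-tier design pays off. In the rough sketch below, the per-terabyte disk price is an assumed number used only for illustration; the four-to-five-fold price gap and the 5-year versus 30-year lifetimes come from the article.

```python
# Back-of-the-envelope media-cost comparison over a 30-year archive lifetime,
# combining the figures quoted above: tape roughly 4-5x cheaper per terabyte,
# ~5-year disk lifetime versus ~30-year tape shelf life. The disk price per
# terabyte is an assumed illustrative number.

archive_tb = 1000            # a hypothetical 1-petabyte slice of the archive
years = 30

disk_price_per_tb = 20.0                       # assumed, $/TB
tape_price_per_tb = disk_price_per_tb / 4.5    # "four or five times less expensive"

disk_generations = years / 5     # disks replaced roughly every 5 years
tape_generations = years / 30    # tape lasts the whole period

disk_media_cost = archive_tb * disk_price_per_tb * disk_generations
tape_media_cost = archive_tb * tape_price_per_tb * tape_generations

print(f"Disk media over {years} years: ${disk_media_cost:,.0f}")
print(f"Tape media over {years} years: ${tape_media_cost:,.0f}")
# Power and cooling for constantly spinning disks would widen the gap further.
```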

And tape capacity keeps improving.

“The storage capacity on tape generally doubles every four to five years,” Novakov said. “We started 26 years ago with 20-gigabyte tape cartridges; now we are at 18 terabytes on one cartridge — and it’s even smaller in physical size.” By periodically rewriting data from older media to new, “we are freeing a lot of slots in the library,” he said.

Meeting ever-increasing data demands

Most of the SDCC’s tape libraries are now located in a facility with power and cooling efficiencies designed specifically for data systems. And there should be enough room for expansion to meet the ever-increasing demand of current experiments as well as those planned for the future.

“RHIC’s newest detector, sPHENIX, with a readout rate of 15,000 events per second, is projected to more than double the data we have now,” said Chou.

After RHIC has completed its science mission toward the end of 2025, it will be transformed into an Electron-Ion Collider (EIC). This new nuclear physics facility is currently in the design stage at Brookhaven and is expected to become operational in the 2030s. Around the same time, a “high-luminosity” upgrade to increase collision rates at the LHC is expected to ramp up the ATLAS experiment’s data output by about ten times!

Plus, SDCC handles smaller data loads for a few other experiments, including the Belle II experiment in Japan, the Deep Underground Neutrino Experiment based at DOE’s Fermi National Accelerator Laboratory, and some experiments at the National Synchrotron Light Source II and Center for Functional Nanomaterials, two other DOE Office of Science user facilities at Brookhaven.

“Space wise, we probably have to grow our physical capacity to one-and-a-half or two times our current size [by adding more racks to the existing facility], while in data capacity, we are growing by a factor of 10 or more,” Novakov said.

Chou has spec’d it out: “From our calculations, our existing tape room can probably hold 1.5 or 1.6 exabytes of data with existing old technology. One exabyte is 1,000 petabytes — a billion billion bytes,” he said. “But we know the capacity of tape technology will grow exponentially. We think with technology upgrades, we can hold three exabytes without major upgrades to our facility.”

AI-enabled analysis — in quasi-real-time

Adding to the challenge is that physicists are now inclined to record more of the data collected by experiments.

“Before, much of the data was filtered by triggers that decided based on certain criteria which collision events to store and which to discard — because we couldn’t keep it all,” Klimentov noted. But now, sPHENIX and the future EIC experiment(s) plan to stream all their data to SDCC and use AI/ML algorithms for data noise suppression. Keeping raw data will ensure that future analyses can access characteristics that selective triggers might have discarded. Scientists could also deploy AI algorithms to mine through unfiltered data to detect patterns, potentially making unforeseen discoveries.

“At the same time, as computing is getting faster and faster,” Klimentov said, “we have managed, at least for LHC, to analyze data in quasi-real time — with a delay of just 56 hours — which was impossible 20 years ago. The reliability of our system — and the people who operate it, who are very important — makes that possible.

“That means we can detect anomalies in practically real time, including anomalies in accelerator and detector performance, while the accelerator and experiments are operating,” he said.

Such real-time AI-enabled analysis of data could guide corrective actions to minimize accelerator/detector downtime or change the way those systems are operating. It also has the potential to alert physicists to something in the data that’s worth a closer look discovery-wise.

EIC’s dual data centers

Handling data for the EIC will present other challenges. Since this new facility is being built in partnership with DOE’s Thomas Jefferson National Accelerator Facility (Jefferson Lab) in Newport News, Virginia, both Jefferson Lab and Brookhaven plan to keep a full record of all the facility’s data.

“For the EIC, we are expecting to collect about 220 petabytes of data per year at nominal collision rates,” Klimentov said. “We are working with Jefferson Lab on how we will organize this data among the two Labs.”

“When you increase your archive and anomaly detection and analysis functioning, you need to also increase monitoring and alarming systems,” he said.

Because Brookhaven will be the site of the collider, he expects some data handling, such as “noise suppression” and pre-filtering, to be done on site before data is shipped and archived. Some of this processing may even take place in the “counting house” computing systems immediately adjacent to the EIC detector, before making its way to SDCC and Jefferson Lab.

“Fortunately, we have junior-generation physicists who have started to learn about these challenges using ATLAS and RHIC data,” Klimentov said.

sPHENIX, with an expected output of 565 petabytes of data and two separate data streams — one going to tape and one going to a disk cache for immediate processing to ensure all detectors are working — is giving them lots of practice.

“I see sPHENIX as a nice steppingstone for a streaming model so that these young physicists, computing professionals, and engineers can learn,” Klimentov said, “and they will eventually work for the EIC.”

SDCC operations are funded by the DOE Office of Science.

* The top spots for U.S. stores of scientific data go to the National Energy Research Scientific Computing Center (NERSC) and the National Oceanic and Atmospheric Administration (NOAA), each with 355 petabytes.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.
