Brookhaven physicist Qiang Li (right) and materials scientist Lijun Wu (left) in an electron microscopy lab at Brookhaven National Laboratory. Photo: Brookhaven National Laboratory.

Compared with conventional superconductors, which must be chilled to near absolute zero (-273°C) before they become superconducting, iron-based superconductors can conduct electricity without resistance at unusually high temperatures. They are also relatively easy to bend and can carry electrical current efficiently in multiple directions under high magnetic fields before superconductivity is suppressed.

These characteristics make iron-based superconductors promising for various advanced energy applications, from electric vehicles to wind turbines to medical imaging devices. However, the amount of current they can carry has been relatively low compared with other superconducting materials. Scientists have been trying to increase this current-carrying capacity, but doing so has often come at the cost of degrading another key superconducting property – the critical temperature at which superconductivity emerges.

Now, a team of scientists has come up with a way to double the amount of electrical current an iron-based material can carry without losing its superconducting properties, while also increasing the material's critical temperature. Their method, reported in a paper in Nature Communications, makes use of a readily available technology that could be used to improve the performance of superconducting wires and tapes.

"By bombarding the material with protons at low energy, we simultaneously increased the material's current-carrying capacity and critical temperature – the first time that this has been possible in an iron-based superconductor," said physicist Qiang Li, head of the Advanced Energy Materials Group at the US Department of Energy's (DOE) Brookhaven National Laboratory, who directed the research. "There are low-energy ion facilities all over the world that can do this inexpensively."

Shooting ions at superconducting materials is a well-established method for artificially introducing defects in the crystal structure of these materials to increase the amount of current they can carry. These defects ‘pin’ in place, or trap, the microscopic magnetic vortices that form when the superconductor is placed in a strong magnetic field, such as those generated by magnets in magnetic resonance imaging (MRI) machines. If free to move around, these vortices dissipate energy, thereby destroying the superconductor's ability to carry electrical current without energy loss.

Unfortunately, if the defects produced by ion irradiation create too much disorder in the superconductor, they lower its critical temperature. Because superconductors currently require extreme cooling to conduct electricity without resistance, any decrease in the critical temperature is undesirable.

In a Goldilocks sense, the defects must be ‘just right’ – not too big, not too small. Scientists can design such defects by selecting the appropriate ion species (e.g. protons, gold ions, argon ions) and ion energy.

"Some ions or energies may cause large enough damage to interfere with superconductivity, while others may not produce any effect at all," explained co-author Toshinori Ozaki, a former scientist in Brookhaven Lab's Advanced Energy Materials Group who is now a faculty member at Kwansei Gakuin University in Japan. "So we run simulations to figure out what combination should produce the optimal defect – one that can hold down the magnetic vortices without negatively impacting the material's superconducting properties."

In the case of the iron-based material the team studied, low-energy protons did the trick. Using electron microscopes, the scientists took images of thin films (about 100 nm thick) of the iron-based material, which they prepared themselves, before and after they hit the films with low-energy protons.

"Throughout the irradiated films, we saw individual chains of defects created by the collisions between the incident ions and nucleus that broke the perfect atomic order, causing the lattice to locally compress or stretch out," said co-author Lijun Wu, a materials scientist at Brookhaven who led the microscopy work.

In a performance check, the scientists measured the amount of electrical current running through the films in low and high magnetic fields. In high fields, the irradiated films carried more current than any low-temperature superconductor. This result suggests that the defects and local lattice strain must be strongly pinning the magnetic vortices that would otherwise impede the flow of current.

To figure out what caused the surprising increase in the critical temperature, the team created a strain 'map' showing where the locally compressed and stretched (tensile) regions were located. Previous studies had shown that the type of lattice strain correlates with the critical temperature: highly compressed regions are associated with a high critical temperature, and highly tensile regions with a low critical temperature. In the team's films, the compressed regions took up much more space, leading to an overall rise in critical temperature.
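
As a minimal, hypothetical illustration of that bookkeeping (not the authors' analysis code), the sketch below takes a 2D strain map – negative values for compression, positive for tension – and compares the area fractions of the two region types. The map here is synthetic; a real one would come from the electron microscopy analysis described above.

```python
import numpy as np

# Synthetic 2D strain map: negative = locally compressed, positive = tensile.
# Values are illustrative only, not data from the study.
rng = np.random.default_rng(0)
strain_map = rng.normal(loc=-0.002, scale=0.004, size=(256, 256))

compressed_fraction = np.mean(strain_map < 0)   # area fraction under compression
tensile_fraction    = np.mean(strain_map > 0)   # area fraction under tension

print(f"Compressed area fraction: {compressed_fraction:.2f}")
print(f"Tensile area fraction:    {tensile_fraction:.2f}")

# The article's reasoning: if compressed regions (associated with higher Tc)
# dominate, the net effect is an overall rise in critical temperature.
if compressed_fraction > tensile_fraction:
    print("Compressed regions dominate -> net rise in critical temperature expected.")
else:
    print("Tensile regions dominate -> net drop in critical temperature expected.")
```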

According to Li, these performance gains are only the beginning: "We believe that the critical current and temperature can be further enhanced by fine tuning the structure of the defects and the arrangement of the lattice strains."

This story is adapted from material from Brookhaven National Laboratory, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.