
ALPHA experiment takes antimatter to a new level


In a paper published today in the journal Nature, the ALPHA collaboration reports that it has literally taken antimatter to a new level. The researchers have observed the Lyman-alpha electronic transition in the antihydrogen atom, the antimatter counterpart of hydrogen, for the first time. The finding comes hot on the heels of recent measurements by the collaboration of another electronic transition, and demonstrates that ALPHA is quickly and steadily paving the way for precision experiments that could uncover as yet unseen differences between the behaviour of matter and antimatter.

The Lyman-alpha (or 1S-2P) transition is one of several in the Lyman series of electronic transitions that were discovered in atomic hydrogen just over a century ago by physicist Theodore Lyman. The transition occurs when an electron jumps from the lowest-energy (1S) level to a higher-energy (2P) level and then falls back to the 1S level by emitting a photon at a wavelength of 121.6 nanometres.
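
For readers who want to see where the 121.6-nanometre figure comes from, the standard Rydberg formula for hydrogen reproduces it from the two energy levels involved. The short Python sketch below is purely illustrative and is not part of the ALPHA analysis.

    # Rough check of the Lyman-alpha (n=1 <-> n=2) wavelength using the Rydberg
    # formula: 1/lambda = R_H * (1/n1^2 - 1/n2^2). Illustrative only.

    R_H = 1.0967758e7   # Rydberg constant for hydrogen, in 1/m

    def lyman_wavelength_nm(n_upper: int) -> float:
        """Wavelength (nm) of the Lyman-series line from level n_upper down to n=1."""
        inverse_wavelength = R_H * (1.0 / 1**2 - 1.0 / n_upper**2)
        return 1e9 / inverse_wavelength

    print(f"Lyman-alpha: {lyman_wavelength_nm(2):.1f} nm")   # ~121.6 nm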

It is a special transition. In astronomy, it allows researchers to probe the state of the medium that lies between galaxies and test models of the cosmos. In antimatter studies, it could enable precision measurements of how antihydrogen responds to light and gravity. Finding any slight difference between the behaviour of antimatter and matter would rock the foundations of the Standard Model of particle physics and perhaps cast light on why the universe is made up almost entirely of matter, even though equal amounts of antimatter should have been produced in the Big Bang.

The ALPHA team makes antihydrogen atoms by taking antiprotons from CERN’s Antiproton Decelerator (AD) and binding them with positrons from a sodium-22 source. It then confines the resulting antihydrogen atoms in a magnetic trap, which prevents them from coming into contact with matter and annihilating. Laser light is then shone onto the trapped atoms to measure their spectral response. The measurement involves using a range of laser frequencies and counting the number of atoms that drop out of the trap as a result of interactions between the laser and the trapped atoms.

The ALPHA collaboration has previously employed this technique to measure the so-called 1S-2S transition. Using the same approach and a series of laser wavelengths around 121.6 nanometres, ALPHA has now detected the Lyman-alpha transition in antihydrogen and measured its frequency with a precision of a few parts in a hundred million, obtaining good agreement with the equivalent transition in hydrogen.

This precision is not as high as that achieved in hydrogen, but the finding represents a pivotal technological step towards using the Lyman-alpha transition to chill large samples of antihydrogen using a technique known as laser cooling. Such samples would allow researchers to bring the precision of this and other measurements of antihydrogen to a level at which any differences between the behaviour of antihydrogen and hydrogen might emerge.

“We are really excited about this result,” says Jeffrey Hangst, spokesperson for the ALPHA experiment. “The Lyman-alpha transition is notoriously difficult to probe – even in ‘normal’ hydrogen. But by exploiting our ability to trap and hold large numbers of antihydrogen atoms for several hours, and using a pulsed source of Lyman-alpha laser light, we were able to observe this transition. Next up is laser cooling, which will be a game-changer for precision spectroscopy and gravitational measurements.”


Ancient Quasars Provide Incredible Evidence for Quantum Entanglement


Using two ancient galactic cores called quasars, researchers have taken a massive step forward toward confirming quantum entanglement — a concept that says that the properties of particles can be linked no matter how far apart in the universe they may be.

If quantum entanglement is valid, then a pair of entangled particles can exist billions of light-years apart from one another and actions affecting the properties of one particle will affect the properties of the other particle. Albert Einstein described this correlation between particles as "spooky action at a distance." Last year, physicists from MIT, the University of Vienna and other institutions provided strong evidence for quantum entanglement, and now, this same team of scientists has gone even further to confirm quantum entanglement.

Scientists looking to prove quantum entanglement have to show that measured correlations between particles cannot be explained by classical physics, according to a statement from MIT describing the new work. In the 1960s, physicist John Bell calculated a theoretical limit, past which correlations between particles must have a quantum, not a classical, explanation.
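
To make Bell's limit concrete: in the common CHSH form of the inequality, any classical, local-hidden-variable account keeps a particular combination of correlations at or below 2, while quantum mechanics allows up to 2√2 for a maximally entangled pair. The Python sketch below evaluates both numbers for one standard choice of measurement angles; it is an illustration of the bound, not the team's analysis code.

    import math

    # CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    # Any local-hidden-variable (classical) model obeys |S| <= 2.
    # For a maximally entangled spin-singlet pair, quantum mechanics predicts
    # E(a,b) = -cos(a - b); photon-polarization experiments have the same
    # structure with doubled angles. |S| can then reach 2*sqrt(2).

    def E(a, b):
        """Quantum correlation for analyzer settings a and b (radians)."""
        return -math.cos(a - b)

    a, a_prime = 0.0, math.pi / 2
    b, b_prime = math.pi / 4, 3 * math.pi / 4

    S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
    print(f"|S| = {abs(S):.3f}  (classical limit 2.000, "
          f"quantum maximum {2 * math.sqrt(2):.3f})")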

[Image: The distant quasar B1608+656 is smeared into bright arcs by two closer galaxies in the foreground. Researchers have used two ancient quasars, which emitted their light billions of years ago, to provide evidence for quantum entanglement. Credit: ESA/Hubble, NASA, Suyu et al.]

But there are loopholes in such tests: scenarios in which observations of what seem to be correlated particles have a hidden, classical explanation, the MIT researchers said. One loophole that scientists are working to close is the "freedom-of-choice" loophole, the possibility that an unknown classical influence affects how the measurements on the entangled particles are chosen or carried out. If that loophole is open, researchers could observe what looks like a quantum correlation when in fact there is none.

Last year, this team of scientists demonstrated, using 600-year-old starlight, that if the correlations they observed between particles could be explained by classical physics, that classical origin would have to stem from more than 600 years ago, before the starlight they used had even left its star.

To close this loophole even further, the researchers have now used distant, ancient quasars, luminous and energetic galactic nuclei, to test whether the correlations between particles could be explained by a classical mechanism set in motion far earlier than 600 years ago. In other words, they're taking the success of their study from last year and scaling it up to provide further evidence for quantum entanglement.

To do this, they chose to use two quasars that emitted light 7.8 billion years ago and 12.2 billion years ago. The researchers used light from these two quasars to determine the angle at which to tilt a polarizer, which measures the orientation of each photon's electric field.

At each detector station, the researchers used a telescope to measure the wavelength of the light arriving from one of the quasars while the entangled photons (light particles) were being measured. If the quasar light was redder than a reference wavelength, a fixed comparison wavelength chosen in advance, the polarizer tilted to one angle to measure the incoming entangled photon. If the light was bluer than the reference wavelength, the polarizer tilted to a different angle.
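
In outline, that setting choice is just a threshold test on the color of each arriving quasar photon. The sketch below illustrates the idea; the reference wavelength and polarizer angles in it are placeholders, not the experiment's actual values.

    # Hypothetical illustration of the "cosmic" setting choice: the color of a
    # photon arriving from a distant quasar, compared with a fixed reference
    # wavelength, decides which angle the polarizer uses for the next entangled
    # photon. All numbers below are placeholders, not the real settings.

    REFERENCE_NM = 700.0                        # assumed reference wavelength
    ANGLE_IF_RED, ANGLE_IF_BLUE = 0.0, 22.5     # placeholder polarizer angles, degrees

    def choose_polarizer_angle(quasar_photon_nm: float) -> float:
        """Pick a measurement setting from the quasar photon's wavelength."""
        if quasar_photon_nm > REFERENCE_NM:     # redder than the reference
            return ANGLE_IF_RED
        return ANGLE_IF_BLUE                    # bluer than the reference

    print(choose_polarizer_angle(652.0), choose_polarizer_angle(780.0))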

In the study performed last year, researchers used small telescopes that only allowed them to measure light from stars 600 light-years away, but by using larger, more powerful telescopes, the researchers have now managed to measure the light from much older, more distant quasars.

In studying entangled photons with these ancient quasars, the team found correlations in over 30,000 pairs of photons. These correlations went well beyond the limit set by Bell, showing that, if there were any classical explanation for the correlated particles, it would have to come from before these ancient quasars emitted light — many billions of years ago.

"If some conspiracy is happening to simulate quantum mechanics by a mechanism that is actually classical, that mechanism would have had to begin its operations — somehow knowing exactly when, where, and how this experiment was going to be done — at least 7.8 billion years ago," Alan Guth, a physicist at MIT and a co-author of the new work, said in the statement. "That seems incredibly implausible, so we have very strong evidence that quantum mechanics is the right explanation."

So, with these findings, it is "implausible" that the measured correlations have a classical explanation, the researchers said. This is strong evidence that quantum mechanics caused this correlation and that quantum entanglement is valid, they said.

"The Earth is about 4.5 billion years old, so any alternative mechanism — different from quantum mechanics — that might have produced our results by exploiting this loophole would've had to be in place long before even there was a planet Earth, let alone an MIT," David Kaiser, also a physicist at MIT and a co-author of the study, added in the statement. "So we've pushed any alternative explanations back to very early in cosmic history."

The work was published Aug. 20 in the journal Physical Review Letters.

Original article by Chelsea Gohd on Space.com.


Any Weight Loss Can Be Healthful, but More Can Be Much Better

Overweight people who lost 5 to 10 percent of their weight lowered their risk for metabolic syndrome by 22 percent. Those who lost 20 percent cut their risk by over 50 percent.

Focal Points (#135) | Friday Forward


Last week, our leadership team participated in a two-day planning session that we do every quarter. At these off-sites, we lay out our most important objectives, as a company and individually, for the next three months.

Each team member also establishes rocks/goals for the quarter. At this quarter’s off-site, our new coaches really pushed us to select and focus on only three goals each, rather than the five-to-six we’ve been accustomed to creating in the past.

Their extensive experience, along with numerous studies, shows that when we set too many goals we actually complete fewer of them.

There is a great story about Warren Buffett related to this that conveys the importance of focus. As the story goes, Buffett heard his personal airline pilot, Mike Flint, talking about his long-term goals and priorities. After he was done, Buffett suggested to Flint that he conduct the following exercise:

  • Step 1: Write down your top 25 career goals on a single piece of paper.
  • Step 2: Circle only your top five options.
  • Step 3: Put the top five on one list and the remaining 20 on a second list.

When Flint commented that he would continue to work on the second list intermittently, Buffett interjected, saying “No. You’ve got it wrong, Mike. Everything you didn’t circle just became your Avoid-At-All-Cost list. No matter what, these things get no attention from you until you’ve succeeded with your top five.”

The implication is that the other 20 goals would distract him from accomplishing his top five most important goals. It’s the same thing if we set too many rocks (goals) for ourselves and our company each quarter.

There is a significant difference between being busy and getting a lot done versus achieving at a high level. For example, many of us start the day with a list of 25 things to do and then end that day with many of the most salient items still on the list. We pass over what is most important in favor of accomplishing things that are quicker or easier to do.

Carrying out Buffett’s process as an individual, team or company is a valuable exercise. In addition, here are a few tips to help you and your team stay focused and accomplish your most important, long-term goals.

  1. Recognize that time is a precious and fixed resource.
  2. Identify what’s Urgent from what’s Important.
  3. Align your top priorities with your core purpose and core values.
  4. Don’t book 100% of your time; schedule in rest and relaxation.
  5. Pay attention to things you should stop doing.
  6. Be selective about who you give your energy to.

The other key to goal-setting is making your goals SMART (specific, measurable, achievable, relevant and timely). For example, “run a long distance” is not a SMART goal; “complete a half-marathon by January 2020” is. Getting SMART and getting focused will help you finish 2018 strong.

Quote of The Week

Zig Ziglar



The Rest of the Story


Is NT really new technology?

When Microsoft released the first version of Windows NT in April 1993, the company's marketing and public relations campaign heavily emphasized the NT (i.e., New Technology) in the operating system's (OS's) name. Microsoft promoted NT as a cutting-edge OS that included all the features users expected in an OS for workstations and small to midsized servers. Although NT was a new OS in 1993, with a new API (i.e., Win32) and new user and systems-management tools, the roots of NT's core architecture and implementation extend back to the mid-1970s.

And now...the rest of the story: I'll take you on a short tour of NT's lineage, which leads back to Digital and its VMS OS. Most of NT's lead developers, including VMS's chief architect, came from Digital, and their background heavily influenced NT's development. After I talk about NT's roots, I'll discuss the more-than-coincidental similarities between NT and VMS, and how Digital reacted to NT's release.

A Brief History of NT

NT's history is closely tied to that of David N. Cutler, NT's chief architect. After graduating from Michigan's Olivet College in 1965, Cutler worked for DuPont. Although computers weren't his first interest, he ran simulations on Digital machines as part of his job at DuPont. Before long, Cutler was knowledgeable about software and decided he wanted to develop OSs rather than application software. He joined Digital in 1971 and worked at Digital's famous "Mill" facility in Maynard, Massachusetts, developing OSs for the PDP-11 family. RSX-11M is the first OS in which Cutler incorporated major concepts and design principles that later surfaced in NT. RSX-11M is a PDP-11 OS Digital developed for industrial and manufacturing control.

In 1975, Digital realized that its competitors were developing 32-bit processors and that this technology would lure customers away from PDP's 16-bit architecture. Gordon Bell, a legendary figure in computer history and then vice president of engineering for Digital, drove the development of the 32-bit processor, which Digital eventually named VAX. By this time a star within Digital, Cutler was part of the initial VAX development team. Digital had charged Cutler, along with Dick Hustvedt and Peter Lipman, with designing VAX's OS, VMS. Digital's primary design goals for VAX hardware included backward compatibility with PDP-11 processors and enough flexibility that VAX could be the basis for low-end desktop workstations as well as enterprise-level servers. Digital also made VMS backward compatible with RSX-11M and designed VMS to run on different size machines. Of this development period, Digital states in its company history that it was "betting the business" on VAX and VMS. In an eerie echo of Digital's statement, Bill Gates recently claimed that Microsoft is "betting the business" on NT 5.0.

In 1977, Digital announced VAX-11/780 and VMS 1.0, making the first product shipments in 1978. As the project leader and one of VMS's main architects, Cutler continued work on successive releases of VMS, but he became restless at Digital. In 1981, Cutler threatened to leave Digital. To retain its star developer, Digital gave Cutler about 200 hardware and software engineers. Cutler moved his group to Seattle and started a development center. This elite group's goal was to design a new CPU architecture and OS that would lead Digital into the 1990s. Digital called the Cutler group's hardware project Prism, and its OS Mica.

In 1988, Digital executives cancelled Cutler's project and laid off many of its group members. Cutler decided to leave Digital, but before he could do so, Microsoft executives learned of the development and realized they had an ideal opportunity to hire Cutler. At the time Cutler left Digital, the release of VMS was version 5.0 (today's version is 7.1).

In August 1988, Bill Gates hired Cutler. One of Cutler's conditions for moving to Microsoft was that he could bring around 20 former Digital employees with him, including several Prism hardware engineers. Microsoft readily met this demand: the company knew hiring an OS architect of Cutler's stature was a coup, and few engineers had Cutler's track record. In addition, Gates felt that Microsoft's long-term future depended on the development of a new OS that would rival UNIX.

Microsoft's internal project name for the new OS was OS/2 NT, because Microsoft's intention was for the new OS to succeed OS/2 yet retain the OS/2 API as its primary interface. The success of Windows 3.0 in April 1990 altered Microsoft's thinking and its relationship with IBM. Six weeks after Microsoft released Windows 3.0, Microsoft renamed OS/2 NT as Windows NT, and designated the Win32 API (a 32-bit evolution of Windows 3.0's 16-bit API) NT's official API. Gates decided that compatibility with the 16-bit Windows API and the ability to run Windows 3.x applications unmodified were NT's paramount goals, in addition to support for portions of the DOS, OS/2, and POSIX APIs. From 1990 to NT's public release in August 1993, Cutler's team was in a mad dash to complete NT, and the project grew to involve more than 200 engineers and testers. Figure 1 shows a timeline of the major events in the history of NT.

TABLE 1: VMS and NT Terminology Translations

  VMS Term                        | NT Translation
  Interrupt Priority Level (IPL)  | Interrupt Request Level (IRQL)
  Asynchronous System Trap (AST)  | Asynchronous Procedure Call (APC)
  Fork Procedure                  | Deferred Procedure Call (DPC)
  I/O Request Packet (IRP)        | I/O Request Packet (IRP)
  Bug Check                       | Bug Check
  System Service                  | System Service
  sys.exe                         | ntoskrnl.exe
  Paged Pool                      | Paged Pool
  Nonpaged Pool                   | Nonpaged Pool
  Look aside List                 | Look aside List
  Section                         | Section

NT and VMS
Most of NT's core designers had worked on and with VMS at Digital; some had worked directly with Cutler. How could these developers prevent their VMS design decisions from affecting their design and implementation of NT? Many users believe that NT's developers carried concepts from VMS to NT, but most don't know just how similar NT and VMS are at the kernel level (despite the Usenet joke that if you increment each letter in VMS you end up with WNT, i.e., Windows NT).

As in UNIX and most commercial OSs, NT has two modes of execution, as Figure 2 shows. In user mode, applications execute, and the OS/2, DOS, and POSIX environment subsystems execute and export APIs for applications to use. These components are unprivileged because NT controls them and the hardware they run on. Without NT's permission, these components cannot directly access hardware. In addition, the components and hardware cannot access each other's memory space, nor can they access the memory associated with NT's kernel. The components in user mode must call on the kernel if they want to access hardware or allocate physical or logical resources.

The kernel executes in a privileged mode: It can directly access memory and hardware. The kernel consists of several Executive subsystems, which are responsible for managing resources, including the Process Manager, the I/O Manager, the Virtual Memory Manager, the Security Reference Monitor, and a microkernel that handles scheduling and interrupts. The system dynamically loads device drivers, which are kernel components that interface NT to different peripheral devices. The hardware abstraction layer (HAL) hides the specific intricacies of an underlying CPU and motherboard from NT. NT's native API is the API that user-mode applications use to speak to the kernel. This native API is mostly undocumented, because applications are supposed to speak Win32, DOS, OS/2, POSIX, or Win16, and these respective OS environments interact with the kernel on the application's behalf.

VMS doesn't have different OS personalities, as NT does, but its kernel and Executive subsystems are clear predecessors to NT's. Digital developers wrote the VMS kernel almost entirely in VAX assembly language. To be portable across different CPU architectures, Microsoft developers wrote NT's kernel almost entirely in C. In developing NT, these designers rewrote VMS in C, cleaning up, tuning, tweaking, and adding some new functionality and capabilities as they went. This statement is in danger of trivializing their efforts; after all, the designers built a new API (i.e., Win32), a new file system (i.e., NTFS), and a new graphical interface subsystem and administrative environment while maintaining backward compatibility with DOS, OS/2, POSIX, and Win16. Nevertheless, the migration of VMS internals to NT was so thorough that within a few weeks of NT's release, Digital engineers noticed the striking similarities.

Those similarities could fill a book. In fact, you can read sections of VAX/VMS Internals and Data Structures (Digital Press) as an accurate description of NT internals simply by translating VMS terms to NT terms. Table 1 lists a few VMS terms and their NT translations. Although I won't go into detail, I will discuss some of the major similarities and differences between Windows NT 3.1 and VMS 5.0, the last version of VMS Dave Cutler and his team might have influenced. This discussion assumes you have some familiarity with OS concepts (for background information about NT's architecture, see "Windows NT Architecture, Part 1" March 1998 and "Windows NT Architecture, Part 2" April 1998).

TABLE 2: Significant VMS and NT Similarities

  VMS | NT
  Process scheduler implements 32 priority levels split into halves | Process scheduler implements 32 priority levels split into halves
  Process scheduler never lowers a process' priority below the priority level the application programmed | Process scheduler never lowers a process' priority below the priority level the application programmed
  Uses boosting to handle CPU hogging | Uses boosting to handle CPU hogging
  Supports SMP | Supports SMP
  Digital introduces kernel threads in VMS 7.0 | NT 3.1 uses kernel threads
  Relies heavily on memory-mapped files | Relies heavily on memory-mapped files
  Uses demand-paged virtual memory for physical memory management | Uses demand-paged virtual memory for physical memory management
  Uses working sets with a clock-based replacement algorithm | Uses working sets with a clock-based replacement algorithm
  Balance Set Manager uses swapping to handle the system's memory demands | Balance Set Manager doesn't use swapping
  Supports a layered-driver model throughout the device driver stacks | Supports a layered-driver model throughout the device driver stacks
  Implements asynchronous packet-based I/O commands | Implements asynchronous packet-based I/O commands
  Represents resources as objects managed by an Object Manager | Represents resources as objects managed by an Object Manager
  Security subsystem based on objects with access control lists (ACLs) | Security subsystem based on objects with ACLs
  MONITOR | Performance Monitor
  BACKUP | NT Backup

NT's processes are virtually the same as VMS's processes (Table 2 shows a comparison of VMS and NT processes). In NT, as in VMS, the process scheduler implements 32 priority levels. The process with the highest priority is always running, and processes with equal priority are scheduled in a round-robin pattern. The system considers the 16 high-priority levels realtime or fixed priorities, because the process scheduler doesn't manipulate priority in processes the system assigns to that range. The 16 low-priority levels (except 0, which the system reserves for the idle thread that executes when nothing else can) are dynamic because the scheduler, often with the input of device drivers, bumps priorities up in reaction to various conditions, such as when the process receives input from a device. This bumping procedure is called boosting. A defining aspect of the NT and VMS schedulers is that they never lower a process' priority below the priority level the application programmed. To handle CPU hogging, in which a process burns CPU cycles without regard to other processes in the system, the scheduler boosts the priority of starved processes that haven't executed for a defined period. Both VMS 5.0 and NT 3.1 schedulers support symmetric multiprocessing (SMP), which lets them execute processes simultaneously on different CPUs to increase applications' performance.
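
As a rough illustration of the scheduling behavior the two OSs share (a conceptual sketch, not NT's or VMS's actual code), the following Python model keeps 32 priority levels, always runs the highest-priority ready process, round-robins within a level, boosts starved processes in the dynamic range, and never drops a process below its programmed base priority.

    from collections import deque

    # Toy scheduler illustrating the shared VMS/NT ideas described above:
    # 32 levels, highest-priority ready process runs first, round-robin within
    # a level, boosts for starved processes in the dynamic range, and no
    # process ever drops below its programmed base priority. Sketch only.

    NUM_LEVELS = 32
    DYNAMIC_MAX = 15        # levels 1-15 are dynamic; 16-31 are fixed ("realtime")
    STARVATION_TICKS = 10   # ready-but-unrun ticks before a process is boosted

    class Process:
        def __init__(self, name, base_priority):
            self.name = name
            self.base = base_priority      # priority level the application programmed
            self.current = base_priority   # may be boosted upward, never below base
            self.waited = 0

    class Scheduler:
        def __init__(self):
            self.ready = [deque() for _ in range(NUM_LEVELS)]

        def make_ready(self, proc):
            self.ready[proc.current].append(proc)

        def boost_starved(self):
            # Anti-starvation: bump long-waiting dynamic-range processes up a level.
            for level in range(1, DYNAMIC_MAX + 1):
                for proc in list(self.ready[level]):
                    proc.waited += 1
                    if proc.waited >= STARVATION_TICKS and proc.current < DYNAMIC_MAX:
                        self.ready[level].remove(proc)
                        proc.current += 1
                        proc.waited = 0
                        self.ready[proc.current].append(proc)

        def pick_next(self):
            # The highest-priority ready process always runs; equals rotate.
            for level in range(NUM_LEVELS - 1, 0, -1):
                if self.ready[level]:
                    proc = self.ready[level].popleft()
                    proc.waited = 0
                    # A boosted priority decays back toward, never below, the base.
                    proc.current = max(proc.base, proc.current - 1)
                    return proc
            return None  # nothing ready: level 0 is reserved for the idle thread

    sched = Scheduler()
    for name, prio in [("editor", 8), ("batch", 4), ("audio", 24)]:
        sched.make_ready(Process(name, prio))
    print(sched.pick_next().name)   # "audio": highest priority runs first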

A major difference between NT process management and VMS process management is that NT processes contain one or more threads of execution, and NT's scheduler gives CPU time to threads, not processes. Digital didn't introduce kernel threads into VMS until version 7.0 in 1995. This addition is one of several enhancements Digital has made to VMS since NT's release that appear to be in response to NT capabilities. In turn, Microsoft added lightweight user-mode threads support to NT 4.0 in 1996, which it copied from the VMS implementation of threads.

The memory managers in NT and VMS are also similar. Both OSs implement virtual memory address maps that the system splits between the currently executing application and the kernel. Both NT and VMS rely heavily on memory-mapped files, especially for mapping the code for executing applications and implementing copy-on-write functionality (because of VAX hardware limitations, VMS provides less efficient copy-on-demand functionality). Physical memory management in NT and VMS relies on demand-paged virtual memory. VMS's memory manager assigns each process upper and lower limits (called working sets) for the amount of physical memory the system can assign them. This feature compartmentalizes applications so that an application with heavy memory demands minimally affects other processes. NT's memory manager incorporates working sets, along with many subtleties of the VMS working-set tuning algorithms.
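
A minimal sketch of the working-set idea (deliberately simplified, with least-recently-used eviction standing in for the real replacement policy, and not the actual VMS or NT memory manager): each process keeps at most its working-set maximum of pages resident, so a memory-hungry process trims its own oldest pages instead of taking memory from everyone else.

    from collections import OrderedDict

    # Simplified working-set model: a process keeps between ws_min and ws_max
    # pages resident; touching a new page while at ws_max evicts that process's
    # own least-recently-used page (LRU stands in for the real policy here).
    # Conceptual sketch only, not the VMS or NT memory manager.

    class WorkingSet:
        def __init__(self, ws_min, ws_max):
            self.ws_min, self.ws_max = ws_min, ws_max   # per-process limits
            self.resident = OrderedDict()               # page number -> True, LRU order

        def touch(self, page):
            if page in self.resident:
                self.resident.move_to_end(page)         # mark most recently used
                return "hit"
            if len(self.resident) >= self.ws_max:
                self.resident.popitem(last=False)       # trim this process's oldest page
            self.resident[page] = True
            return "fault"

    ws = WorkingSet(ws_min=4, ws_max=8)
    for page in [1, 2, 3, 4, 5, 6, 7, 8, 9, 1, 10]:
        ws.touch(page)
    print(len(ws.resident), sorted(ws.resident))   # never more than ws_max pages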

As with the process manager, notable differences exist between NT's and VMS's memory manager. VMS's Balance Set Manager moves entire processes' memory footprints out of memory to paging files and back to memory in response to the overall memory demands of the system. Microsoft did not carry this mechanism, known as swapping, into NT's Balance Set Manager, although some of NT's Balance Set Manager's secondary responsibilities are the same as the secondary responsibilities of VMS's Balance Set Manager.

NT's I/O Manager is closely based on VMS's I/O Manager. Both OSs' I/O Managers support a layered-driver model throughout the device driver stacks for different device types, implement asynchronous packet-based I/O commands, and load and unload device drivers dynamically. Stackable and loadable drivers make NT and VMS very extensible. Either OS can divide functionality among several device drivers, with each driver implementing a different abstraction level. For example, the system can insert a fault-tolerant disk driver between a file system driver and a disk driver. This configuration lets the fault-tolerant disk driver receive a request the system sends to one logical drive (e.g., the C drive), then send the request to multiple physical drives to implement mirroring or striping. Asynchronous I/O enables applications and the kernel subsystems to initiate device requests and work while the requests are in progress, rather than wait idly for the requests to complete. NT's device driver architecture and interrupt-request priority scheme are based on VMS's. Descriptions of these aspects of the I/O Manager are applicable to both OSs with little variation.
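
The layered-driver model reads naturally as a chain of components that each handle or forward a request packet. The sketch below mirrors the fault-tolerant example in the text, inserting a mirroring layer between a file-system layer and two disk layers; it is a conceptual illustration, not the real NT or VMS driver interfaces.

    # Conceptual sketch of a layered driver stack passing an I/O request packet
    # (an IRP-like dict) downward. Not the actual NT/VMS driver interfaces.

    class DiskDriver:
        def __init__(self, name):
            self.name = name
        def handle(self, irp):
            return [f"{self.name}: wrote {irp['length']} bytes at offset {irp['offset']}"]

    class MirrorDriver:
        """Fault-tolerant layer: duplicates each request to two physical disks."""
        def __init__(self, disk_a, disk_b):
            self.disks = (disk_a, disk_b)
        def handle(self, irp):
            results = []
            for disk in self.disks:           # send the same packet to both disks
                results += disk.handle(irp)
            return results

    class FileSystemDriver:
        """Top layer: turns a file write into a block-level request packet."""
        def __init__(self, lower):
            self.lower = lower
        def write(self, path, data):
            irp = {"op": "write", "offset": 2048, "length": len(data), "path": path}
            return self.lower.handle(irp)     # pass the packet down the stack

    stack = FileSystemDriver(MirrorDriver(DiskDriver("disk0"), DiskDriver("disk1")))
    for line in stack.write(r"C:\logs\app.log", b"hello"):
        print(line)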

As you can see by comparing Figure 2 and Figure 3, the Executive subsystems exhibit the most significant resemblance between VMS and NT. But many minor similarities exist in which it is clear that Microsoft derived NT's capabilities from VMS. For example, both NT and VMS represent resources as objects that the system manages through an Object Manager, which implements uniform reference counting and accounting. The Object Manager regulates resource allocation and calls the Executive subsystem functions that request notification of certain object operations. VMS object management is not formalized as it is in NT, and the VMS Object Manager is just a loose collection of functions. Microsoft extended NT's Object Manager so that it provides a uniform naming model for all kernel resources.
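
A minimal sketch of the uniform naming and reference counting an Object Manager provides (a conceptual model only, not NT's kernel interfaces): every resource is an object with a name and a count of outstanding references, and the manager reclaims the object when the last reference is released.

    # Toy object manager: uniform naming plus reference counting for
    # kernel-style resources. Conceptual only; not NT's real Object Manager API.

    class KernelObject:
        def __init__(self, name, payload):
            self.name, self.payload, self.refcount = name, payload, 0

    class ObjectManager:
        def __init__(self):
            self.namespace = {}                 # uniform name -> object

        def create(self, name, payload):
            self.namespace[name] = KernelObject(name, payload)
            return self.reference(name)

        def reference(self, name):
            obj = self.namespace[name]
            obj.refcount += 1                   # caller now holds a reference
            return obj

        def dereference(self, obj):
            obj.refcount -= 1
            if obj.refcount == 0:               # last reference gone: reclaim it
                del self.namespace[obj.name]

    om = ObjectManager()
    evt = om.create("\\BaseNamedObjects\\demo_event", payload="event")   # name is illustrative
    om.dereference(evt)                          # object disappears with its last reference
    print("\\BaseNamedObjects\\demo_event" in om.namespace)              # False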

NT's security subsystem is based on objects with discretionary access control lists (DACLs). DACLs determine which users can perform various operations on those objects. Digital added a DACL enhancement to VMS's security model in version 4.0 in 1984. Therefore, VMS's security implementation is the predecessor to NT's. Microsoft even included system tools similar to VMS's in NT, including the Performance Monitor, which is based on MONITOR, the extensible VMS performance monitor. VMS included a utility called BACKUP long before Microsoft developed NT's backup utility.

"Why the Fastest Chip Didn't Win" (Business Week, April 28, 1997) states that when Digital engineers noticed the similarities between VMS and NT, they brought their observations to senior management. Rather than suing, Digital cut a deal with Microsoft. In the summer of 1995, Digital announced Affinity for OpenVMS, a program that required Microsoft to help train Digital NT technicians, help promote NT and Open-VMS as two pieces of a three-tiered client/server networking solution, and promise to maintain NT support for the Alpha processor. Microsoft also paid Digital between 65 million and 100 million dollars.

The Evolution of NT and VMS
Although Microsoft presents NT as a homegrown OS, NT is actually much older than its official 1993 birthdate. NT contains architectural and design influences from another company's flagship OS. Interestingly, throughout the 1990s, Digital introduced many NT features to VMS, and Microsoft has added VMS developments to NT. For example, VMS featured native clustering support in 1984, and 64-bit memory and system APIs in 1996. Microsoft did not introduce clustering support to NT until late last year, and only on a limited scale, and several years might pass before Microsoft releases 64-bit NT. Reciprocally, Microsoft released NT's first version with support for kernel-mode threads, system-wide event logging, and a configuration database called the Registry. VMS introduced kernel-mode threads in VMS 7.0 in 1995, and VMS 7.2 will include NT-style event logging and a Registry.

The saga goes on. Now that Compaq has acquired Digital, will VMS continue to evolve, or will NT seal the fate of its predecessor? One thing is certain: NT will continue to grow, leaving its origins further and further behind.


How To Take Better Photos on Your Phone


Nowadays, whether you’re in town or on a trail, most people are walking around with a smartphone in their pocket. These powerful and compact devices come in handy in a number of ways in the wild, from letting you check your location on a topo map to identifying stars in the night sky. And, of course, you can use them to snap high-quality photos of your journey.

Taking a photo with a phone is quick and intuitive: You open the camera app, point and tap. But, when you swipe through your pics from your last big adventure, you might wish more of them were worthy of sharing or being hung up on your wall.

Here are nine tips for taking better outdoor photos on your phone:

1. Know Your Camera

Smartphones come loaded with a default camera app for taking photos. For many people, this app has all the functionality needed to take great pictures. Here are some of the basic features to look for and learn how to use:

    • Exposure adjustment: Many default camera apps let you adjust the exposure of an image. You may need to tap and hold or swipe to do so. Adjusting the exposure gives you control over how light or dark the final image will appear.
    • Focal point and exposure level lock: On many phones, you can press and hold your finger on a subject to lock the focal point and exposure. This ensures that subject will be in focus and exposed just how you want it to be when you click the shutter.
    • HDR mode: When you take a photo in high dynamic range (HDR) mode, your camera actually snaps several pictures at different exposures and then combines them to create a single image. This is a great way to capture the full range of lights and darks that are in a scene. Using HDR can be especially effective when you’re taking pics of beautiful landscapes that have a range of color and brightness, such as sunrise over the countryside. You’ll want to play around with HDR to figure out when to use it and when not to. In some instances HDR doesn’t do such a great job, like when you’re trying to take a picture of a subject in motion.
    • Timer: A timer is handy when you want to be included in the group photo or for snapping pics in low light when even the slightest tap to take the photo can bump the camera and blur the image.
    • Flash on/off: In most cases, you’ll get better results by using natural light rather than resorting to the phone’s harsh built-in flash, so make sure you know how to turn the flash off.
    • Add gridlines: Go into the settings for your camera and turn on the gridlines. With a grid overlaid on your camera, it will be easier to compose your photos and get the horizon line level.

2. Use a Third-Party Camera App

If your default camera app lacks any of the basic features listed above or you long for even greater control when taking pictures, you’ll want to download a third-party camera app (some are free, others you will have to pay for). These apps can unlock a ton of functionality, including:

    • Shutter speed adjustment: Shutter speed is how long the camera’s shutter is open and exposing your phone’s sensor to light. With a faster shutter speed, less light reaches the sensor, which will stop the motion of moving objects. With a slower shutter speed, more light reaches the sensor, which will blur the motion. By being able to adjust shutter speed, you gain a bunch of creative control over the final exposure. For instance, if you’re taking pictures of a friend riding on a bike, you could increase the shutter speed to freeze the motion for a sharp image. Or you could slow down the shutter speed to blur the movement of the rider. If you use a slow shutter speed, you’ll want to put your phone on a small tripod to prevent camera shake from blurring the image.
    • ISO adjustment: Like shutter speed adjustment, ISO adjustment is a way you can control the final exposure of the image. ISO comes from the days of shooting film and was used to rate how sensitive the film was to light. These days it’s used to rate how sensitive the camera’s digital sensor is to light. By increasing the ISO in your phone’s camera, you’re making the sensor more sensitive to light, which will allow you to take pictures in lower light. However, keep in mind that the higher you boost the ISO the more likely it is that you’ll lose a bit of image quality; a higher ISO often results in a grainier image. (For a concrete sense of how shutter speed and ISO trade off, see the sketch after this list.)
    • Burst mode: Burst mode fires off a bunch of pictures in just a matter of seconds. This is great for capturing quick-moving objects, like your buddy launching off a ski jump.
    • Image stabilization: If you’re shooting in low light and struggling to get sharp images, try using an app with image stabilization. This can help eliminate blur caused by camera shake.
    • File formats: Some apps will allow you to save images in different file formats, such as RAW, to preserve image quality. This is especially nice if you plan to do a bit of editing after taking the picture.
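
To make the shutter-speed/ISO trade-off concrete, the sketch below compares relative exposure at a fixed aperture, where exposure scales roughly with ISO multiplied by shutter time; the specific numbers are illustrative, not recommended settings.

    # Simplified exposure comparison at a fixed aperture: relative brightness of
    # the final image scales roughly with ISO x shutter time. Halving the shutter
    # time (to freeze motion) therefore needs roughly double the ISO to keep the
    # same exposure. Numbers here are illustrative, not recommendations.

    def relative_exposure(iso, shutter_seconds):
        return iso * shutter_seconds

    base   = relative_exposure(iso=100, shutter_seconds=1 / 60)   # relaxed handheld shot
    faster = relative_exposure(iso=200, shutter_seconds=1 / 120)  # freeze motion, bump ISO

    print(f"base: {base:.3f}, faster: {faster:.3f}")   # both ~1.667: same exposure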

3. Use the Rule of Thirds

You know a good photo when you see one. But, what is it exactly that makes it good? A lot of it is in the composition, and one of the most basic compositional techniques is the rule of thirds.

The rule of thirds says that by breaking your image up into thirds vertically and horizontally with gridlines and positioning the subject(s) either along the lines or at the intersection of two lines, you’ll end up with a more balanced and visually interesting picture. For example, rather than composing a photo so that your friend is smack dab in the middle, try taking the picture with them positioned off to one side. Or, when you’re shooting a beautiful landscape, try placing the horizon at the top or bottom third of the photo rather than cutting across the middle. Turning on the grid in your phone’s camera app is an easy way to keep the rule of thirds top of mind while taking pictures.


4. Make Use of Leading Lines

Carefully composing lines in your pictures is a powerful way of drawing the viewer’s attention to the main subject and creating a sense of movement through the image. For instance, a picture that shows a hiking trail coming in from the bottom left of the image and going off into the distance at the upper right can guide the viewer’s eye through the image to the solo hiker at the end of the trail. Trails naturally make good leading lines, but so do roads, rivers, shorelines, trees and cliffs.


5. Use the Foreground, Middle Ground and Background

When composing your photos, try to include interesting elements in the foreground, middle ground and/or background. For example, if you’re taking a picture of the sun setting behind a mountainous horizon, rather than only including the horizon line in the photo, try incorporating something interesting in the foreground and/or middle ground to draw the viewer’s eye through the image. This could be a person, a group of trees, some rocks or a beautiful lake.

If you want to play around with blurring the foreground, middle ground and/or background, you’ll need to see if your camera app has a mode that can do this, such as a portrait mode, since most smartphone cameras won’t let you adjust the aperture for a shallow depth of field. You can also add blurring to photos with many editing apps (look for something called lens blur or similar). See tip No. 8 for more about editing your photos.


6. Change Your Perspective

When using a phone camera, it’s easy to get in the habit of always holding your phone in front of you with outstretched arms and snapping photos from head height. But phones are so small and light that you can easily mix things up and shoot from just about any perspective. Try changing your viewpoint by taking photos from a bird’s-eye view or worm’s-eye view. Or, you can shoot from the hip. You can also try getting really close to your subject—as close as your phone will allow while still being able to focus.

7. Add Camera Accessories

There are quite a few add-ons for phone cameras that can help you achieve a certain creative element that you’re trying to capture. For example, if you want to take longer exposures of a beautiful stream in order to blur the movement of water, you’ll need a mini tripod to stabilize your camera. Another option is to add lenses to your phone so you can take fish-eye, macro and wide-angle photos.


8. Edit Your Photos

A little photo editing can go a long way toward improving your photos. Most default camera apps that come on smartphones allow some level of editing, but by downloading a third-party app like Snapseed (iPhone or Android) or Adobe Lightroom CC (iPhone or Android) you get a lot more control. Apps like these allow you to adjust brightness, tweak contrast, boost saturation, apply filters, sharpen fuzzy images and a whole lot more. Most of them also have auto adjustments that make it really quick and easy to edit photos.

9. Add Motion

If you’re struggling to capture the grandeur of the scene around you with a still photo, try playing around with time-lapse, slo-mo and video. These come standard on most phone camera apps and can be a fun and beautiful way to capture things like moving water, the setting sun or your friend goofing around on the trail.

The post How To Take Better Photos on Your Phone appeared first on REI Co-op Journal.
