How Small Can Computers Get? Computing In A Molecule

ScienceDaily (Dec. 30, 2008) — Over the last 60 years, ever-smaller generations of transistors have driven exponential growth in computing power. Could molecules, each turned into a minuscule computer component, trigger even greater growth in computing over the next 60?


Atomic-scale computing, in which computer processes are carried out in a single molecule or using a surface atomic-scale circuit, holds vast promise for the microelectronics industry. It allows computers to continue to increase in processing power through the development of components at the nano- and picoscale. In theory, atomic-scale computing could put computers more powerful than today’s supercomputers in everyone’s pocket.

“Atomic-scale computing researchers today are in much the same position as transistor inventors were before 1947. No one knows where this will lead,” says Christian Joachim of the French National Scientific Research Centre’s (CNRS) Centre for Material Elaboration & Structural Studies (CEMES) in Toulouse, France.

Joachim, the head of the CEMES Nanoscience and Picotechnology Group (GNS), is currently coordinating a team of researchers from 15 academic and industrial research institutes in Europe whose groundbreaking work on developing a molecular replacement for transistors has brought the vision of atomic-scale computing a step closer to reality. Their efforts, a continuation of work that began in the 1990s, are today being funded by the European Union in the Pico-Inside project.

In a conventional microprocessor – the “motor” of a modern computer – transistors are the essential building blocks of digital circuits, creating logic gates that process true or false signals. A few transistors are needed to create a single logic gate and modern microprocessors contain billions of them, each measuring around 100 nanometres.

Transistors have continued to shrink in size since Intel co-founder Gordon E. Moore famously predicted in 1965 that the number of transistors that can be placed on a processor would double roughly every two years. But there will inevitably come a time when the laws of quantum physics prevent any further shrinkage using conventional methods. That is where atomic-scale computing comes into play with a fundamentally different approach to the problem.
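To illustrate the compounding that Moore's prediction implies, a short sketch is given below. The starting count of 2,300 transistors corresponds to the earliest microprocessors of the early 1970s; the two-year doubling period is the rule of thumb stated above, and the projection span is chosen only for illustration.

```python
def transistor_count(start_count, years, doubling_period=2):
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period)

# The first microprocessors (early 1970s) held about 2,300 transistors.
# Projected 36 years ahead at a two-year doubling period, the count
# reaches the hundreds of millions -- roughly the scale of the
# billion-transistor chips the article describes.
projected = transistor_count(2_300, 36)
print(f"{projected:,.0f}")
```

Eighteen doublings multiply the count by 2^18, or about 262,000, which is why a few decades of steady shrinkage turn thousands of transistors into hundreds of millions.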

“Nanotechnology is about taking something and shrinking it to its smallest possible scale. It’s a top-down approach,” Joachim says. He and the Pico-Inside team are turning that upside down, starting from the atom, the molecule, and exploring if such a tiny bit of matter can be a logic gate, memory source, or more. “It is a bottom-up or, as we call it, ‘bottom-bottom’ approach because we do not want to reach the material scale,” he explains.

Joachim’s team has focused on taking one individual molecule and building up computer components, with the ultimate goal of hosting a logic gate in a single molecule.

How many atoms to build a computer?

“The question we have asked ourselves is how many atoms does it take to build a computer?” Joachim says. “That is something we cannot answer at present, but we are getting a better idea about it.”

The team has managed to design a simple logic gate of 30 atoms that performs the same task as 14 transistors, while also exploring the architecture, technology and chemistry needed to achieve computing inside a single molecule and to interconnect molecules.

They are focusing on two architectures: one that mimics the classical design of a logic gate but in atomic form, including nodes, loops, meshes etc., and another, more complex, process that relies on changes to the molecule’s conformation to carry out the logic gate inputs and quantum mechanics to perform the computation.

The logic gates are interconnected using scanning-tunnelling microscopes and atomic-force microscopes – devices that can measure and move individual atoms with resolutions down to 1/100 of a nanometre (that is one hundred millionth of a millimetre!). As a side project, partly for fun but partly to stimulate new lines of research, Joachim and his team have used the technique to build tiny nano-machines, such as wheels, gears, motors and nano-vehicles each consisting of a single molecule.

“Put logic gates on it and it could decide where to go,” Joachim notes, pointing to what would be one of the world’s first implementations of atomic-scale robotics.

The importance of the Pico-Inside team’s work has been widely recognised in the scientific community, though Joachim cautions that it is still very much fundamental research. It will be some time before commercial applications emerge from it. However, emerge they all but certainly will.

“Microelectronics needs us if logic gates – and as a consequence microprocessors – are to continue to get smaller,” Joachim says.

The Pico-Inside researchers, who received funding under the ICT strand of the EU’s Sixth Framework Programme, are currently drafting a roadmap to ensure computing power continues to increase in the future.

Early Americans Faced Rapid Late Pleistocene Climate Change And Chaotic Environments

ScienceDaily (Feb. 21, 2006) — The environment encountered when the first people migrated into the New World was variable and ever-changing, according to a Penn State geologist.


“The New World was not a nice quiet place when humans came,” says Dr. Russell Graham, associate professor of geology and director of the Earth & Mineral Sciences Museum.

Archaeologists agree that by 11,000 years ago, people were spread across North and South America, but evidence is building for an earlier entry into the New World, a date that would put human population of North and South America firmly in the Pleistocene.

“We want to know what it was like back then,” says Graham. “What did they have to deal with?”

The Pleistocene-Holocene transition took place about 11,000 years ago and caused the extinction of a large number of animal species, including mammoths, mastodons and ground sloths. The Holocene looked very different from the Pleistocene.

“We now realize that climate changes extremely rapidly,” Graham told attendees at the annual meeting of the American Association for the Advancement of Science today (Feb.19) in St. Louis, Mo. “The Pleistocene to Holocene transition occurred in about 40 years.”

As a result, animals and plants shifted around and the people living in the New World had to adapt so that they could find the necessary resources to survive. Graham likened the change to the difference between shopping at a Wal-Mart, where there is great abundance and large variety — the Pleistocene — and suddenly having to shop at a corner convenience store — the Holocene. In human terms this means that what grandparents knew to be true about finding resources could be untrue and unhelpful to grandchildren.

During the Pleistocene, large coastal resources existed in the east, including walruses as far south as Virginia, seals and a variety of fish. Mammoths, caribou and mastodons were plentiful across the continent, as were smaller animals. The situation was not identical everywhere in North America because, during segments of the Pleistocene, large portions of eastern North America were covered in ice, while western locations were ice-free much farther north.

“The Holocene climate is much more stable than the Pleistocene — warmer but more stable,” says Graham. “The environment, however, became more homogeneous; there was less variety.”

Graham argues that the Pleistocene experienced a series of rapid climate changes that created patchiness in the environment, but that once the climate change that signaled the beginning of the Holocene occurred, the climate settled down. Humans coming into the New World during the late Pleistocene would have encountered an environment shaped by rapid changes creating variety in available food sources both animal and vegetable. The groups of people would have to adapt continually and find new resources, but the variety of resources was out there. After the Holocene took hold, there was less need to adapt constantly, but also fewer options in resources.

Archaeologists and geologists debate whether the climate change at the Pleistocene-Holocene transition caused the extinction of the megafauna or whether the influx of humans did in the large animals. Graham believes that it was the unstable, rapidly changing climate, not human predation, that killed the large Pleistocene animals.

Man May Have Caused Pre-historic Extinctions

ScienceDaily (May 5, 2006) — New research shows that pre-historic horses in Alaska may have been hunted into extinction by man, rather than by climate change as previously thought.


The discovery by Andrew Solow of Woods Hole Oceanographic Institution, US, David Roberts of the Royal Botanic Gardens, Kew and Karen Robbirt of the University of East Anglia (UEA) is published this week in Proceedings of the National Academy of Sciences (PNAS).

The accepted view had previously been that the wild horses became extinct long before the extinction of mammoths and the arrival of humans from Asia – ruling out the possibility that they were over-hunted by man. One theory had been that a period of climate cooling wiped them out.

However, the researchers have discovered that uncertainties in dating fossil remains and the incompleteness of fossil records mean that the survival of the horse beyond the arrival of humans cannot be ruled out.

The PNAS paper develops a new statistical method to help resolve the inherent problems associated with dating fossils from the Pleistocene period. The aim is to provide a far more accurate timetable for the extinction of caballoid horses and mammoths and, ultimately, the cause.

“This research is exciting because it throws open the debate as to whether climate change or over-hunting may have led to the extinction of pre-historic horses in North America,” said UEA’s Karen Robbirt.

The Pleistocene period refers to the first epoch of the Quaternary period between 1.64 million and 10,000 years ago. It was characterised by extensive glaciation of the northern hemisphere and the evolution of modern man around 100,000 years ago.

It is known that the end of the Pleistocene period was a time of large-scale extinctions of animals and plants in North America and elsewhere but the factors responsible have remained open to question, with climate change and over-hunting by humans the prime suspects.

Competition, Not Climate Change, Led To Neanderthal Extinction, Study Shows

ScienceDaily (Dec. 30, 2008) — In a recently conducted study, a multidisciplinary French-American research team with expertise in archaeology, past climates, and ecology reported that Neanderthal extinction was principally a result of competition with Cro-Magnon populations, rather than the consequences of climate change.



The study, reported in the online, open-access journal PLoS ONE on December 24, figures in the ongoing debate on the reasons behind the eventual disappearance of Neanderthal populations, which occupied Europe prior to the arrival of human populations like us around 40,000 years ago. Led by Dr William E. Banks, the authors, who belong to the French Centre National de la Recherche Scientifique, l’Ecole Pratique des Hautes Etudes, and the University of Kansas, reached their conclusion by reconstructing climatic conditions during this period and analyzing the distribution of archaeological sites associated with the last Neanderthals and the first modern human populations with an approach typically used to study the impact of climate change on biodiversity.

This method uses geographic locations of archaeological sites dated by radiocarbon, in conjunction with high-resolution simulations of past climates for specific periods, and employs an algorithm to analyze relationships between the two datasets to reconstruct potential areas occupied by each human population and to determine if and how climatic conditions played a role in shaping these areas. In other words, by integrating archaeological and paleoenvironmental datasets, this predictive method can reconstruct the regions that a past population could potentially have occupied. By repeating the modeling process hundreds of times and evaluating where the errors occur, this machine-learning algorithm is able to provide robust predictions of regions that could have been occupied by specific human cultures.
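The repeated-resampling idea described above can be sketched in miniature. The toy below is not the study's actual algorithm and uses made-up site and climate values: it fits a simple "climate envelope" (the range of climate values seen at occupied sites) on hundreds of bootstrap resamples of the site data, and scores each candidate map cell by the fraction of runs that include it, which is how resampling and voting yield robust occupancy predictions.

```python
import random

def envelope(sites):
    """Fit a toy 'climate envelope': the min and max of each climate
    variable observed at the occupied sites."""
    dims = len(sites[0])
    return [(min(s[d] for s in sites), max(s[d] for s in sites))
            for d in range(dims)]

def inside(cell, env):
    """A cell is predicted occupiable if every climate variable falls
    within the fitted envelope."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(cell, env))

def predict_occupancy(sites, cells, runs=200, seed=0):
    """Refit the envelope on bootstrap resamples of the site data many
    times; a cell's score is the fraction of runs that include it."""
    rng = random.Random(seed)
    votes = [0] * len(cells)
    for _ in range(runs):
        sample = [rng.choice(sites) for _ in sites]
        env = envelope(sample)
        for i, cell in enumerate(cells):
            votes[i] += inside(cell, env)
    return [v / runs for v in votes]

# Hypothetical data: sites as (mean temperature C, annual rainfall mm).
sites = [(8, 500), (10, 620), (7, 480), (9, 550), (11, 600)]
cells = [(9, 540),   # climatically similar to the known sites
         (25, 100)]  # far outside their range
scores = predict_occupancy(sites, cells)
print(scores)  # the first cell gets a high score, the second 0.0
```

Repeating the fit on resampled data, as the paragraph above describes, means a cell is only predicted occupiable if the conclusion survives many perturbations of the site sample rather than depending on one or two extreme sites.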

This modeling approach also allows the projection of the ecological footprint of one culture onto the environmental conditions of a later climatic phase. By comparing this projected prediction to the known archaeological sites dated to that later period, it is possible to determine whether the ecological niche exploited by the human population remained the same, or whether it contracted or expanded during that period of time.

Comparing these reconstructed areas for Neanderthals and anatomically modern humans during each of the climatic phases concerned, and by projecting each niche onto the subsequent climatic phases, Banks and colleagues determined that Neanderthals had the possibility to maintain their range across Europe during a period of less severe climatic conditions called Greenland Interstadial 8 (GI8).

However, the archaeological record shows that this did not occur, and Neanderthal disappearance occurs at a point when we see the geographic expansion of the ecological niche occupied by modern humans during GI8. The researchers’ models predict the southern limit of the modern human territory to be near the Ebro River Valley in northern Spain during the preceding cold period called Heinrich Event 4 (H4), and that this southern boundary moved to the south during the more temperate phase GI8.

The researchers conclude that the Neanderthal populations that occupied what is now southern Spain were the last to survive because they were able to avoid direct competition with modern humans, since the two populations exploited distinct territories during the cold climatic conditions of H4. They also point out that during this expansion, contact between Neanderthals and modern humans may have permitted cultural and genetic exchanges.

Samuel Huntington’s Warning

He predicted a ‘clash of civilizations,’ not the illusion of Davos Man.

The last of Samuel Huntington’s books — “Who Are We? The Challenges to America’s National Identity,” published four years ago — may have been his most passionate work. It was like that with the celebrated Harvard political scientist, who died last week at 81. He was a man of diffidence and reserve, yet he was always caught up in the political storms of recent decades.

“This book is shaped by my own identities as a patriot and a scholar,” he wrote. “As a patriot I am deeply concerned about the unity and strength of my country as a society based on liberty, equality, law and individual rights.” Huntington lived the life of his choice, neither seeking controversies, nor ducking them. “Who Are We?” had the signature of this great scholar — the bold, sweeping assertions sustained by exacting details, and the engagement with the issues of the time.

He wrote in that book of the “American Creed,” and of its erosion among the elites. Its key elements — the English language, Christianity, religious commitment, English concepts of the rule of law, the responsibility of rulers, and the rights of individuals — he said are derived from the “distinct Anglo-Protestant culture of the founding settlers of America in the seventeenth and eighteenth centuries.”

Critics who branded the book as a work of undisguised nativism missed an essential point. Huntington observed that his was an “argument for the importance of Anglo-Protestant culture, not for the importance of Anglo-Protestant people.” The success of this great republic, he said, had hitherto depended on the willingness of generations of Americans to honor the creed of the founding settlers and to shed their old affinities. But that willingness was being battered by globalization and multiculturalism, and by new waves of immigrants with no deep attachments to America’s national identity. “The Stars and Stripes were at half-mast,” he wrote in “Who Are We?”, “and other flags flew higher on the flagpole of American identities.”

Three possible American futures beckoned, Huntington said: cosmopolitan, imperial and national. In the first, the world remakes America, and globalization and multiculturalism trump national identity. In the second, America remakes the world: Unchallenged by a rival superpower, America would attempt to reshape the world according to its values, taking to other shores its democratic norms and aspirations. In the third, America remains America: It resists the blandishments — and falseness — of cosmopolitanism, and reins in the imperial impulse.

Huntington made no secret of his own preference: an American nationalism “devoted to the preservation and enhancement of those qualities that have defined America since its founding.” His stark sense of realism had no patience for the globalism of the Clinton era. The culture of “Davos Man” — named for the watering hole of the global elite — was disconnected from the call of home and hearth and national soil.

But he looked with a skeptical eye on the American expedition to Iraq, uneasy with those American conservatives who had come to believe in an “imperial” American mission. He foresaw frustration for this drive to democratize other lands. The American people would not sustain this project, he observed, and there was the “paradox of democracy”: Democratic experiments often bring in their wake nationalistic populist movements (Latin America) or fundamentalist movements (Muslim countries). The world tempts power, and denies it. It is the Huntingtonian world; no false hopes and no redemption.

In the 1990s, when the Davos crowd and other believers in a borderless world reigned supreme, Huntington crossed over from the academy into global renown, with his “clash of civilizations” thesis. In an article first published in Foreign Affairs in 1993 (then expanded into a book), Huntington foresaw the shape of the post-Cold War world. The war of ideologies would yield to a civilizational struggle of soil and blood. It would be the West versus the eight civilizations dividing the rest — Latin American, African, Islamic, Sinic, Hindu, Orthodox, Buddhist and Japanese.

In this civilizational struggle, Islam would emerge as the principal challenge to the West. “The relations between Islam and Christianity, both orthodox and Western, have often been stormy. Each has been the other’s Other. The 20th-century conflict between liberal democracy and Marxism-Leninism is only a fleeting and superficial historical phenomenon compared to the continuing and deeply conflictual relation between Islam and Christianity.”

He had assaulted the zeitgeist of the era. The world took notice, and his book was translated into 39 languages. Critics insisted that men want Sony, not soil. But on 9/11, young Arabs — 19 of them — would weigh in. They punctured the illusions of an era, and gave evidence of the truth of Huntington’s vision. With his typical precision, he had written of a “youth bulge” unsettling Muslim societies, and young, radicalized Arabs, unhinged by modernity and unable to master it, emerging as the children of this radical age.

If I may be permitted a personal narrative: In 1993, I had written the lead critique in Foreign Affairs of his thesis. I admired his work but was unconvinced. My faith was invested in the order of states that the West itself built. The ways of the West had become the ways of the world, I argued, and the modernist consensus would hold in key Third-World countries like Egypt, India and Turkey. Fifteen years later, I was given a chance in the pages of The New York Times Book Review to acknowledge that I had erred and that Huntington had been correct all along.

A gracious letter came to me from Nancy Arkelyan Huntington, his wife of 51 years (her Armenian descent an irony lost on those who dubbed him a defender of nativism). He was in ill-health, suffering the aftermath of a small stroke. They were spending the winter at their summer house on Martha’s Vineyard. She had read him my essay as he lay in bed. He was pleased with it: “He will be writing you himself shortly.” Of course, he did not write, and knowing of his frail state I did not expect him to do so. He had been a source of great wisdom, an exemplar, and it had been an honor to write of him, and to know him in the regrettably small way I did.

We don’t have his likes in the academy today. Political science, the field he devoted his working life to, has been in the main commandeered by a new generation. They are “rational choice” people who work with models and numbers and write arid, impenetrable jargon.

More importantly, nowadays in the academy and beyond, the patriotism that marked Samuel Huntington’s life and work is derided, and the American Creed he upheld is thought to be the ideology of rubes and simpletons, the affliction of people clinging to old ways. The Davos men have perhaps won. No wonder the sorrow and the concern that ran through the work of Huntington’s final years.

Mr. Ajami is professor of Middle East Studies at The Johns Hopkins University, School of Advanced International Studies. He is also an adjunct research fellow at Stanford University’s Hoover Institution.

MRC’s Graham Discusses Worst Bias of 2008 on ‘O’Reilly Factor’

Praise the Lord and pass the video clips!

What do Bill Maher slamming Pope Benedict XVI as the criminal head of a pedophilia ring, the Washington Post’s Sally Quinn defending anti-American Rev. Jeremiah Wright, and CNN founder Ted Turner prophesying environmental apocalypse have in common?

They are just three of the most outrageous quotes from the mainstream media in 2008 and were featured on the December 23 “O’Reilly Factor” in a segment with MRC’s Director of Media Analysis Tim Graham.

You can view the segment by clicking the link below.

http://newsbusters.org/blogs/nb-staff/2008/12/29/mrcs-graham-discusses-worst-bias-2008-oreilly

Homicide Rates: Back toward Crack among Blacks

After a huge drop-off with the end of the crack wars around 1995, the black homicide perpetration rate has turned up again in this decade. For black male 14-17-year-olds, according to tables prepared by James Alan Fox of Northeastern U., the number of homicide perpetrators in absolute terms is up 34% from 2000-2001 to 2006-2007; it is up 12% for black 18-24-year-olds and up 17% for black men 25 and older.

In contrast, for “whites” (a category that appears to include most Hispanics), the number of homicide perpetrators is up 3% for 14-17-year-olds, down 2% for 18-24-year-olds, and up 6% for those 25 and older. The federal government carefully breaks out Hispanic data for almost everything except crime statistics, which makes non-black crime numbers hard to interpret. My guess would be that the homicide rate for whites/Hispanics is falling because the number of whites/Hispanics is growing rapidly due to Hispanic growth. Unfortunately, we can’t use federal figures to break down white versus Hispanic crime trends, but I would guess that crime rate trends are pretty quiet among both whites and Hispanics in this decade.

Here in LA, there was a spike in Hispanic gang murders after Villaraigosa was elected mayor in 2005, but the LAPD remains in the capable hands of William Bratton, and that has faded out.

My assumption is that technological trends, especially the spread of cellphones and cellphone cameras, have made crime a riskier business, so crime rates should be dropping, all else being equal.

When I debated economist Steven Levitt over crime in Slate in 1999, he asked me what my prediction for future crime trends was: I replied that I figured that black teens are currently benefiting from the example of their many older brothers and cousins whom the crack wars left in jail, wheelchairs, or cemeteries, but that eventually a new cohort of black teens would come along without direct experience of the horrors of crack wars of 1988-1994, and the homicide rate would go back up again.

Chicago, America’s most segregated big city

Racial lines were drawn over the city’s history and remain entrenched by people’s choice, economics

By Azam Ahmed and Darnell Little

Tribune reporters

December 26, 2008

The paths taken by Colin Lampark and Rosalyn Bates help illustrate why Chicago is the most racially segregated big city in America.

Both are young professionals with handsome earning potential. Both moved to the city a few years ago—Lampark, 28, to Lincoln Park; Bates, 31, to Bronzeville. And both chose neighborhoods reflecting their race, a practice common in Chicago.

Their personal stories, and many others, explain why blacks in Chicago are the most isolated racial group in the nation’s 20 largest cities, according to a Tribune analysis of 2008 population estimates. To truly integrate Chicago, 84 percent of the black or white population would need to change neighborhoods, the data show.
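The “84 percent” figure is a standard index of dissimilarity: half the sum, over all neighborhoods, of the absolute difference between each group's share of its citywide total. A minimal sketch with made-up neighborhood counts (not the Tribune's actual data) shows how the statistic is computed:

```python
def dissimilarity(group_a, group_b):
    """Index of dissimilarity: the share of either group that would have
    to change neighborhoods for every neighborhood to mirror the
    citywide ratio. Arguments are per-neighborhood population counts,
    listed in the same neighborhood order."""
    total_a, total_b = sum(group_a), sum(group_b)
    return 0.5 * sum(abs(a / total_a - b / total_b)
                     for a, b in zip(group_a, group_b))

# Hypothetical four-neighborhood city, heavily segregated:
black = [900, 50,  40,  10]
white = [ 20, 30, 450, 500]
print(round(dissimilarity(black, white), 2))  # -> 0.9, i.e. 90 percent
```

A value of 0 would mean every neighborhood matches the citywide racial mix; a value of 1 would mean complete separation. Chicago's 84 percent sits near the top of that scale.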

The calculations paint a starkly different picture from the ones broadcast across the nation during Barack Obama’s Election Night rally last month, when his hometown looked like one unified, harmonious city.

The fact is, racial patterns that took root in the 1800s are not easy to reverse. Racial steering, discriminatory business practices and prejudice spawned segregation in Chicago, and now personal preferences and economics fuel it.

“Once institutions exist, they tend to persist, and it requires some act of force to get them to change,” said Douglas Massey of Princeton University, an expert on segregation.

For Lampark, who is white, the move last year to Lincoln Park from Minneapolis came because he had friends there. It wasn’t a racially motivated decision, he said. Lampark, an engineer, just doesn’t know anyone on the South Side.

Bates, who is black, settled in Bronzeville for similar reasons.

“It put us closer to friends,” she said.

She, however, may pay more dearly for her decision. Segregated African-American neighborhoods have less access to health care, quality education and employment opportunities than white areas, the research shows. Black homeowners can expect to receive 18 percent less value for their homes, according to one study—a tax the researcher attributed primarily to segregation.

James Hamilton, 50, a deckhand from Woodlawn, can live with that. In his experience, which includes 30 years on the South Side, he doesn’t think that whites would welcome him to their neighborhood.

“It ain’t never been us,” he said. “It’s always been [whites]—just don’t want to be around us.”

The research shows he may not be entirely wrong. While whites are willing to vote for Obama, they aren’t nearly as interested in living in neighborhoods rich in color.

Blacks make up about 35 percent of Chicago’s population of nearly 3 million and are largely concentrated on the South and West Sides. Whites make up nearly 28 percent, largely located to the north and in slivers of the South Side, while Hispanics, about 30 percent of the population, are scattered to the Northwest and Southwest Sides of the city center.

Dating back to the late 19th Century, blacks were confined to certain neighborhoods in Chicago by pen and sword, with legal restrictions and real estate practices ensuring whatever bombs and batons did not.

During the Great Migration in the early 20th Century, hundreds of thousands of blacks followed those patterns of settlement, creating densely populated communities on the South Side that hardened racial fault lines.

Real estate agents showing people homes only in certain neighborhoods and restrictive covenants guaranteed that blacks did not spread across the city or into the suburbs. Redlining ensured that black areas received less financing and investment.

Slum clearance and urban renewal in the 1940s and ’50s displaced more blacks. Most found housing in the deeper South Side, in areas rapidly turning over with the onset of white flight. The poorest moved into public housing, which transformed into housing largely for blacks.

The city decided to build high-rises for public-housing residents, a move that would prove fatal to hopes for integration. White aldermen refused to place the high-rises in their wards, so nearly all were placed in black areas.

“By the time civil rights comes along, the die has already been cast,” said Arnold Hirsch, a historian at the University of New Orleans and author of “Making the Second Ghetto: Race and Housing in Chicago 1940-1960.”

“It’s no longer how you set up something, but how do you uproot something that’s already taken hold,” Hirsch said.

More recently, income differences between racial groups have helped further entrench separation, clustering lower-income minorities into urban ghettos that beget further isolation.

But perhaps the most controversial driver of segregation today in cities such as Chicago is personal taste: People tend to select areas where their own color has a large presence or they have some familiarity.

“It plays a huge role because the neighborhoods have been firmly established, and Chicago has had a greater history of racial segregation than other cities,” said William Julius Wilson, professor of sociology and public policy at Harvard University.

Chicago’s history meant that churches and family networks for whites and blacks developed in separate areas.

Those connections prompted Reginald Halbert’s move to Kenwood 10 years ago. Halbert, who had been living in the suburbs, considered the North Side but decided to build his gated home on the South Side, where he grew up.

“We wanted to be in close proximity to all the things that matter to us,” said Halbert, 44. “Our work, our family and our religious institutions.”

Some studies show that blacks tend to prefer a more diverse neighborhood, something closer to a 50-50 split of blacks and whites, but those tend not to exist in a city as old as Chicago.

Research indicates that whites tend to have a lower tolerance for blacks and other minorities. A 2000 study found that whites prefer neighborhoods where they are nearly 60 percent of the population and blacks represent about 17 percent.

One theory posits that whites associate black neighborhoods with high crime and poor-quality schools. A recent study conducted in the Chicago and Detroit areas by the University of Illinois at Chicago and University of Michigan found that whites consistently rate a neighborhood higher when its residents are white regardless of the physical quality of the neighborhood.

Not only do the studies show a white reluctance to move into black neighborhoods, research shows that the share of whites who say they would leave a neighborhood grows as the proportion of black residents increases. That has proved true in Chicago.

“Chicago is a very, very large city with a large population of Hispanics and blacks and a declining white population,” said Harvard’s Wilson. “But it’s still a city in which people can find housing in other areas, and as long as there are areas to which whites can retreat, it will be difficult to reduce the overall segregation.”

Cities with smaller black populations, such as Tucson, Ariz., or Seattle, show greater integration. Chicago’s large black population would exceed most white thresholds, experts say.

Another factor that separates Chicago from other places is its age. Older cities in the Midwest and Northeast were established before restrictive housing policies were outlawed. Experts say more newly developed cities—such as Austin, Texas; San Jose, Calif.; and Charlotte, N.C.—are likely to see higher levels of integration.

Said Jacob Vigdor, an economist at Duke University: “What integration requires is the presence of blank slates.”

Even then, federal studies of equally matched black and white couples show that unequal racial treatment for both renters and buyers still exists.

“We live in a country where we think people should be able to move freely, so we don’t have a lot of policies or laws that either encourage or constrain people’s residential choices,” said Mary Pattillo, a professor at Northwestern University. “Our laws that are supposed to defend against discrimination put the burden on the individual.”

A final factor often cited as a reason that segregation persists is economics. Poor end up living with poor, and because blacks maintain the lowest place on the socioeconomic food chain, they are often lumped together.

But research shows that blacks largely remain segregated from whites across income levels, though to a lesser extent than 30 years ago.

Many higher-income African-Americans who could afford to live anywhere in the city choose to live among blacks, even at the expense of wealth accumulation in their homes.

“It provides a certain comfort for middle-class African-Americans who may work in a corporate environment where they are minorities to live in a neighborhood where they aren’t a minority,” said Richard Pierce, chairman of the Africana studies department at the University of Notre Dame.

Bates, of Bronzeville, might fit into that category. A clinical therapist, she and her attorney sister canvassed much of the city before selecting a neighborhood.

“There is a comfort level being among people of your own race,” she said. “I don’t think that there was any intention of segregation behind that.”