Wednesday, April 21, 2010

Nanoengineering

Nanoengineering is the practice of engineering on the nanoscale. It derives its name from the nanometre, a unit of measurement equalling one billionth of a metre.

Nanoengineering is largely a synonym for nanotechnology, but emphasizes the engineering rather than the pure science aspects of the field.

The first nanoengineering program in the world was started at the University of Toronto within the Engineering Science program, as one of the Options of study in the final years. In 2003, the Lund Institute of Technology started a program in Nanoengineering. In 2005, the University of Waterloo established a program that offers a full degree in Nanotechnology Engineering.[1] The University of California, San Diego followed shortly thereafter in 2007 with its own department of Nanoengineering.

Techniques

  • Photolithography - Using light to produce patterns in a photosensitive chemical layer, and then etching to transfer the pattern to the surface below.
  • Electron beam lithography - Similar to photolithography, but using electron beams instead of light.
  • Scanning tunneling microscope (STM) - Can be used both to image and to manipulate structures as small as a single atom.
  • Molecular self-assembly - Arbitrary sequences of DNA can now be synthesized cheaply in bulk and used to create custom proteins or regular patterns of amino acids. Similarly, DNA strands bind to their complementary strands, allowing simple structures to self-assemble (see the sketch below this list).
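
The pairing rule behind DNA-based self-assembly can be stated in a few lines of code. The following Python sketch is only an illustration of Watson-Crick complementarity; the sequences used are invented for the example and do not come from any real design.

    # Minimal sketch of Watson-Crick complementarity, the pairing rule that
    # lets designed DNA strands self-assemble. Sequences here are invented
    # purely for illustration.

    PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def reverse_complement(strand: str) -> str:
        """Return the strand that would bind this one in antiparallel fashion."""
        return "".join(PAIR[base] for base in reversed(strand))

    def binds(a: str, b: str) -> bool:
        """True if strand b is the exact reverse complement of strand a."""
        return b == reverse_complement(a)

    if __name__ == "__main__":
        tile_edge = "ATGCGT"                  # hypothetical "sticky end"
        print(reverse_complement(tile_edge))  # ACGCAT
        print(binds(tile_edge, "ACGCAT"))     # True: these two edges hybridize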



nanoHUB

nanoHUB.org is science cyberinfrastructure[1] comprising community-contributed resources and geared toward educational applications, professional networking, and interactive simulation tools for nanotechnology. Funded by the National Science Foundation (NSF), it is a product of the Network for Computational Nanotechnology (NCN), a multi-university initiative of eight member institutions, including Purdue University, the University of California at Berkeley, the University of Illinois at Urbana-Champaign, the Molecular Foundry at Lawrence Berkeley National Laboratory, Norfolk State University, Northwestern University, and the University of Texas at El Paso. NCN was established to create a resource for nanoscience and nanotechnology via online services for research, education, and professional collaboration. NCN supports research efforts in nanoelectronics; nanoelectromechanical systems (NEMS); nanofluidics; nanomedicine/biology; and nanophotonics.

Resources

nanoHUB.org is NCN's gateway for delivery and hosting of simulation tools, course materials, lectures, seminars, tutorials, user groups, and online meetings. nanoHUB hosts approximately 1200 user-contributed resources[2][3] focused on education and simulation in nanotechnology and related topics. nanoHUB interactive simulation tools are accessible from web browsers and run via a distributed computing network at Purdue University, as well as the TeraGrid and Open Science Grid. These resources are provided by approximately 500 member contributors[4] in the nanoscience community.

nanoHUB resources include:[5]

  • Online Seminars
  • Online Group Meeting Rooms
  • Virtual Linux Workspaces that facilitate tool development within an in-browser Linux machine
  • Online Workshops
  • User Groups
  • Interactive Simulation Tools for nanotechnology and related fields
  • Lectures, Podcasts & Learning Materials in multiple formats
  • Course Curricula for educators
  • News & Events for Nanotechnology

Simulation Tools

nanoHUB provides in-browser simulation tools geared toward nanotechnology, electrical engineering, chemistry, and semiconductor education. nanoHUB simulations are available to users both as stand-alone tools and as parts of structured teaching and learning curricula comprising numerous tools. As of December 31, 2009, nanoHUB hosted 160 distinct simulation tools.[6] nanoHUB users develop and contribute their own tools for live deployment.

Examples of these tools include:


SCHRED, which calculates the envelope wavefunctions and the corresponding bound-state energies in a typical MOS (Metal-Oxide-Semiconductor) or SOS (Semiconductor-Oxide-Semiconductor) structure and a typical SOI structure by self-consistently solving the one-dimensional (1D) Poisson equation and the 1D Schrödinger equation.
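
As a rough illustration of the Schrödinger half of that self-consistent loop (this is not SCHRED itself), the Python sketch below computes bound envelope states in a fixed triangular well, a textbook stand-in for the confining potential at a MOS interface. The field strength, effective mass and grid size are placeholders chosen for the example.

    # Minimal sketch (not SCHRED): bound states of an electron in a fixed
    # triangular potential V(z) = e*F*z, a simple stand-in for the confining
    # potential at a MOS interface. SCHRED additionally iterates this step with
    # a 1D Poisson solve until the potential and the charge are self-consistent.
    import numpy as np

    HBAR = 1.054571817e-34      # J*s
    M_E  = 9.1093837015e-31     # kg
    E_CH = 1.602176634e-19      # C

    def bound_states(field_v_per_m=1e8, m_eff=0.92 * M_E, width_nm=30.0, n=1000):
        """Lowest envelope states of -(hbar^2/2m) psi'' + e*F*z psi = E psi
        on [0, width] with psi(0) = psi(width) = 0 (hard wall at the oxide)."""
        z = np.linspace(0.0, width_nm * 1e-9, n)
        dz = z[1] - z[0]
        v = E_CH * field_v_per_m * z                  # triangular well, in joules
        t = HBAR**2 / (2.0 * m_eff * dz**2)
        # tridiagonal finite-difference Hamiltonian on the interior points
        main = 2.0 * t + v[1:-1]
        off = -t * np.ones(n - 3)
        h = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
        energies, vectors = np.linalg.eigh(h)
        return energies, vectors, z

    if __name__ == "__main__":
        e, psi, z = bound_states()
        print("lowest subband energies (eV):", e[:3] / E_CH)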


Quantum Dot Lab, which computes the eigenstates of a particle in a box of various shapes including domes and pyramids.
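
Quantum Dot Lab handles arbitrary dot shapes; for the simplest shape, a hard-walled rectangular box, the eigenenergies have a closed form, E = (ħ²π²/2m)(nx²/Lx² + ny²/Ly² + nz²/Lz²). The sketch below evaluates that textbook formula only; the box dimensions and effective mass are illustrative values, not taken from the tool.

    # Textbook check for the simplest quantum-dot shape: a hard-walled box.
    # E(nx,ny,nz) = (hbar^2 pi^2 / 2m) * (nx^2/Lx^2 + ny^2/Ly^2 + nz^2/Lz^2)
    # Dimensions and effective mass below are illustrative only.
    import numpy as np

    HBAR = 1.054571817e-34
    M_E  = 9.1093837015e-31
    E_CH = 1.602176634e-19

    def box_levels(lx, ly, lz, m_eff, n_max=3):
        """Return the sorted particle-in-a-box energies (eV) up to n_max per axis."""
        pref = HBAR**2 * np.pi**2 / (2.0 * m_eff)
        levels = []
        for nx in range(1, n_max + 1):
            for ny in range(1, n_max + 1):
                for nz in range(1, n_max + 1):
                    e = pref * (nx**2 / lx**2 + ny**2 / ly**2 + nz**2 / lz**2)
                    levels.append((e / E_CH, (nx, ny, nz)))
        return sorted(levels)

    if __name__ == "__main__":
        # a 10 x 10 x 5 nm box with a GaAs-like effective mass of 0.067 m_e
        for energy, state in box_levels(10e-9, 10e-9, 5e-9, 0.067 * M_E)[:5]:
            print(f"{state}: {energy:.3f} eV")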


Bulk Monte Carlo Tool, which calculates the bulk values of the electron drift velocity, electron average energy and electron mobility for electric fields applied along arbitrary crystallographic directions in group IV (Si and Ge) and compound semiconductor (GaAs, SiC and GaN) materials.
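
The real tool uses full band structure and many scattering mechanisms; the toy sketch below keeps only the bare Monte Carlo idea: free flight under the field interrupted by randomly timed collisions, from which drift velocity and mean energy are estimated as ensemble averages. The constant scattering time, the crude "thermalizing" collision model and the effective mass are assumptions made for the example, not the tool's physics.

    # Toy ensemble Monte Carlo (not the nanoHUB tool): one parabolic valley,
    # constant scattering rate, field along z, collisions that rethermalize the
    # carrier. Drift velocity and mean kinetic energy are ensemble averages.
    import numpy as np

    E_CH = 1.602176634e-19
    M_E  = 9.1093837015e-31
    K_B  = 1.380649e-23

    def run_mc(field=1e5, tau=1e-13, m_eff=0.26 * M_E, temp=300.0,
               n_electrons=5000, n_flights=200, seed=0):
        """Estimate drift velocity (m/s) and mean kinetic energy (eV)."""
        rng = np.random.default_rng(seed)
        accel = E_CH * field / m_eff          # acceleration magnitude along z
        sigma = np.sqrt(K_B * temp / m_eff)   # thermal velocity spread per axis
        v = rng.normal(0.0, sigma, (n_electrons, 3))
        vz_sum = ke_sum = 0.0
        for _ in range(n_flights):
            dt = rng.exponential(tau, n_electrons)  # random free-flight times
            v[:, 2] += accel * dt                   # accelerate along the field
            vz_sum += v[:, 2].mean()                # sample just before scattering
            ke_sum += (0.5 * m_eff * (v**2).sum(axis=1)).mean()
            # crude thermalizing collision: redraw velocities from a Maxwellian
            v = rng.normal(0.0, sigma, (n_electrons, 3))
        return vz_sum / n_flights, ke_sum / n_flights / E_CH

    if __name__ == "__main__":
        vd, ke = run_mc()
        print(f"drift velocity ~ {vd:.3e} m/s, mean energy ~ {ke:.3f} eV")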


Crystal Viewer, which helps in visualizing various types of Bravais lattices, planes and Miller indices needed for many materials science, electronics and chemistry courses. Large bulk systems of different materials (silicon, InAs, GaAs, diamond, graphene, buckyballs) can also be viewed using this tool.
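
In the same spirit, the small sketch below generates the lattice points of a block of face-centred cubic (FCC) unit cells and evaluates the cubic interplanar spacing d = a/sqrt(h² + k² + l²) for a chosen set of Miller indices. The silicon lattice constant is used only as an example value.

    # Small sketch in the spirit of Crystal Viewer: FCC lattice points over a few
    # unit cells, plus the interplanar spacing for given Miller indices (cubic case).
    import numpy as np

    FCC_BASIS = np.array([[0.0, 0.0, 0.0],
                          [0.5, 0.5, 0.0],
                          [0.5, 0.0, 0.5],
                          [0.0, 0.5, 0.5]])   # fractional coordinates

    def fcc_points(a, n_cells=2):
        """Cartesian positions (same units as a) of an n x n x n FCC block."""
        cells = np.array([[i, j, k] for i in range(n_cells)
                                    for j in range(n_cells)
                                    for k in range(n_cells)], dtype=float)
        return a * (cells[:, None, :] + FCC_BASIS[None, :, :]).reshape(-1, 3)

    def d_spacing_cubic(a, h, k, l):
        """Interplanar spacing d_hkl = a / sqrt(h^2 + k^2 + l^2) for a cubic lattice."""
        return a / np.sqrt(h**2 + k**2 + l**2)

    if __name__ == "__main__":
        a_si = 0.5431  # nm, silicon lattice constant (FCC Bravais lattice)
        pts = fcc_points(a_si)
        print(f"{len(pts)} lattice points in a 2x2x2 block")
        print(f"d(111) = {d_spacing_cubic(a_si, 1, 1, 1):.4f} nm")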


Band Structure Lab, which uses the sp3s*d5 tight-binding method to compute E(k) for bulk, planar, and nanowire semiconductors. Using this tool, researchers may compute and visualize the band structures of bulk semiconductors, thin films, and nanowires for various materials, growth orientations, and strain conditions. Physical parameters such as the bandgap and effective mass can also be obtained from the computed E(k). The band edges and effective masses of the bulk materials and the nanostructures can be analyzed as a function of various strain conditions.
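
The tool's sp3s*d5 basis is far beyond a short example, but the smallest tight-binding model that still produces a dispersion E(k), a bandwidth and a band-edge effective mass is a one-band nearest-neighbour chain, E(k) = eps0 - 2t cos(ka). The sketch below uses that toy model with made-up parameters purely to illustrate the kind of quantity the tool computes.

    # Smallest possible tight-binding band structure (not the sp3s*d5 model used
    # by Band Structure Lab): a one-band nearest-neighbour chain with lattice
    # constant a and hopping energy t gives E(k) = eps0 - 2 t cos(k a).
    import numpy as np

    def chain_band(eps0=0.0, t=1.0, a=1.0, n_k=9):
        """Return (k, E(k)) across the first Brillouin zone [-pi/a, pi/a]."""
        k = np.linspace(-np.pi / a, np.pi / a, n_k)
        return k, eps0 - 2.0 * t * np.cos(k * a)

    def effective_mass_band_edge(t=1.0, a=1.0, hbar=1.0):
        """m* = hbar^2 / (d2E/dk2) at the band minimum (k = 0): hbar^2 / (2 t a^2)."""
        return hbar**2 / (2.0 * t * a**2)

    if __name__ == "__main__":
        k, e = chain_band()
        for ki, ei in zip(k, e):
            print(f"k = {ki:+.3f}, E = {ei:+.3f}")
        print("band-edge effective mass (hbar = a = 1):", effective_mass_band_edge())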

Infrastructure

nanoHUB utilizes numerous resources for the development, deployment, and presentation of simulations and related materials. These include the HUBzero platform, the Rappture toolkit, and the Maxwell's Daemon middleware.

HUBzero Platform for Scientific Collaboration

nanoHUB is powered by the HUBzero[7] software developed at Purdue University. HUBzero allows individuals to create web sites that connect a community in scientific research and educational activities. HUBzero sites combine Web 2.0 concepts with middleware that provides access to interactive simulation tools including access to TeraGrid[8], the Open Science Grid, and other national grid computing resources.

HUBzero was created by researchers at Purdue University in conjunction with the NSF-sponsored Network for Computational Nanotechnology. The hub is a web site built from open source software: the Linux operating system, the Apache web server, the MySQL database, the Joomla content management system, and the PHP web scripting language. The HUBzero software allows individuals to access simulation tools and share information. Sites using the hub infrastructure are standardized with the following modules:

  • Interactive simulation tools, hosted on the hub cluster and delivered to web browsers
  • Simulation tool development area, including source code control and bug tracking
  • Animated presentations delivered in Flash format
  • Mechanism for uploading and sharing resources
  • 5-star ratings and user feedback for resources
  • User support area, with question-and-answer forum
  • Statistics about users and usage patterns

Rappture Toolkit

The Rappture (Rapid APPlication infrastrucTURE) toolkit provides the basic infrastructure for the development of a large class of scientific applications, allowing scientists to focus on their core algorithm. It does so in a language-neutral fashion, so one may access Rappture in a variety of programming environments, including C/C++, Fortran and Python. To use Rappture, a developer describes all of the inputs and outputs for the simulator, and Rappture generates a Graphical User Interface (GUI) for the tool automatically [9].
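
A hedged sketch of what a Rappture-wrapped tool's Python driver typically looks like is shown below: the tool reads its inputs from the driver XML file that Rappture hands it, runs its core calculation, and writes the outputs back for the generated GUI to display. The element paths and names ("input.number(temperature)" and so on) are illustrative examples, not those of any real nanoHUB tool, and the script only runs where the Rappture Python module is installed, such as a nanoHUB workspace.

    # Illustrative sketch of a Rappture tool driver in Python. The element paths
    # used here are made-up examples; a real tool defines its own inputs and
    # outputs in its tool.xml description.
    import sys
    import Rappture                      # available inside nanoHUB workspaces

    # Rappture passes the tool a driver XML file containing the user's inputs
    io = Rappture.library(sys.argv[1])

    # read an input value the generated GUI collected for us
    temperature = float(io.get('input.number(temperature).current'))

    # the tool's "core algorithm" -- here just a placeholder calculation
    result = 1.0 / temperature

    # write the output back; Rappture renders it in the generated GUI
    io.put('output.number(inverse_t).current', str(result))
    Rappture.result(io)                  # signal that the run is complete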

Workspaces

A workspace is an in-browser Linux desktop that provides access to NCN's Rappture toolkit, along with computational resources available on the NCN, Open Science Grid, and TeraGrid networks. One can use these resources to conduct research, or as a development area for new simulation tools. One may upload code, compile it, test it, and debug it. Once code is tested and working properly in a workspace, it can be deployed as a live tool on nanoHUB.

A user can use normal Linux tools to transfer data into and out of a workspace. For example, sftp yourlogin@sftp.nanohub.org will establish a connection with a nanoHUB file share. Users can also use built-in WebDAV support on Windows, Macintosh, and Linux operating systems to access their nanoHUB files on a local desktop.

Middleware

The web server uses a daemon to dynamically relay incoming VNC connections to the execution host on which an application session is running. Instead of using the port router to set up a separate channel by which a file import or export operation is conducted, it uses VNC to trigger an action on the browser which relays a file transfer through the main nanoHUB web server. The primary advantage of consolidating these capabilities into the web server is that it limits the entry point to the nanoHUB to one address: www.nanohub.org. This simplifies the security model and reduces the number of independent security certificates to manage.

One disadvantage of consolidating most communication through the web server is the lack of scalability when too much data is transferred by individual users. To avoid network congestion, the web server can be replicated and clustered under one name by means of DNS round-robin selection.

The backend execution hosts that support Maxwell can operate with conventional Unix systems, Xen virtual machines, and a form of virtualization based on OpenVZ. For each system, a VNC server is pre-started for every session. When OpenVZ is used, that VNC server is started inside of a virtual container. Processes running in that container cannot see other processes on the physical system, see the CPU load imposed by other users, dominate the resources of the physical machine, or make outbound network connections. By selectively overriding the restrictions imposed by OpenVZ, it is possible to synthesize a fully private environment for each application session that the user can use remotely.[10]

Usage

The majority of users come from academic institutions using nanoHUB as part of their research and educational activities. Users also come from national labs and private industry. As a scientific resource, nanoHUB is cited more than 260 times in the scientific literature [11][12]. Approximately sixty percent of the citations stem from authors not affiliated with the NCN. More than 200 of the citations refer to nanotechnology research, with more than 150 of them citing concrete resource usage. Twenty citations elaborate on nanoHUB use in education and more than 30 refer to nanoHUB as an example of national cyberinfrastructure.

Nanoscale network

Nanoscale network, short for nanoscale communication network, is a telecommunications technology that provides networked communication among objects that are the size of biological cells and smaller.[1][2] The goal is to achieve efficient network communication at the smallest possible scale. The communication channels are on the order of individual molecules.


Approaches

Nanoscale and molecular networks are being developed by a variety of means, for example, leveraging biology including molecular motors, calcium signaling, and gap junctions as well as non-biological mechanisms such as diffusion, carbon nanotubes, and quantum mechanical phenomena.

External Links

IEEE NanoCom Subcommittee

Notes

  1. ^ Nanoscale Communication Networks, Bush, S. F., ISBN 978-1-60807-003-9, Artech House, 2010 [1]
  2. ^ UCI Netgroup Wiki on Molecular Communication

Translational research

Translational research is another term for translative research and translational science, although the name fails to distinguish it from forms of research that are not scientific (e.g., market research), which are currently considered outside its scope. Translational research is a way of thinking about and conducting scientific research to make the results of research applicable to the population under study, and it is practised in the natural and biological, behavioural, and social sciences. In the field of medicine, for example, it is used to translate the findings of basic research more quickly and efficiently into medical practice and, thus, meaningful health outcomes, whether physical, mental, or social. In medicine in particular, governmental funders of research and pharmaceutical companies have spent vast amounts internationally on basic research and have seen that the return on investment is significantly less than anticipated. Translational research has come to be seen as the key, missing component.

With its focus on removing barriers to multi-disciplinary collaboration, translational research has the potential to drive the advancement of applied science. An attempt to bridge these barriers has been undertaken particularly in the medical domain where the term translational medicine has been applied to a research approach that seeks to move “from bench to bedside” or from laboratory experiments through clinical trials to actual point-of-care patient applications. However, it should be noted that the term translational medicine is a misnomer, as medicine is not a science: it is the clinical practice of healing the given individual whereas science addresses principles and populations. This distinction is the primary reason that science needs to be translated at all. "Translational medicine" would best be termed "Translational medical science".

Comparison to basic research or applied research

Translational research is a paradigm for research alternative to the dichotomy of basic research and applied research. It is often applied in the domain of medicine but has more general applicability as a distinct research approach. It is also allied in practice with the approaches of participative science and participatory action research.

The traditional categorization of research identifies just two categories: basic research (also labelled fundamental or pure research) and applied research. Basic research is more speculative and takes a long time – often measured in decades – to be applied in any practical context. Basic research often leads to breakthroughs or paradigm-shifts in practice. Applied research on the other hand is characterised as being capable of having an impact in practice within a relatively short time, but would often represent an incremental improvement to current processes rather than delivering radical breakthroughs.

The cultural separation between different scientific fields makes it difficult to establish the multidisciplinary and multi-skilled teams that are necessary to be successful in translational research. Other challenges arise in the traditional incentives which reward individual principal investigators over the types of multi-disciplinary teams that are necessary for translational research. Also, journal publication norms often require tight control of experimental conditions, and these are difficult to achieve in real-world contexts [1].

Outside of the medical domain, this mode of research can be applied more generally where researchers seek to shorten the time-frame and conflate the basic-applied continuum to ‘translate’ fundamental research results into practical applications. It is of necessity a much more iterative style of research with low and permeable barriers and a great deal of interaction between academic research and industry practice. Practitioners help shape the research agenda in supplying what may be intractable problems to which applied research approaches will only offer incremental improvements.

Challenges in translational research

To flourish, translational research requires a knowledge-driven ecosystem, in which constituents generate, contribute, manage and analyze data available from all parts of the landscape. The goal is a continuous feedback loop to accelerate the translation of data into knowledge. Collaboration, data sharing, data integration and standards are integral. Only by seamlessly structuring and integrating these data types will the complex underlying causes and outcomes of illness be revealed, and effective prevention, early detection and personalized treatments be realized.

Translational research requires that information and data flow from hospitals, clinics and study participants in an organized and structured format to repositories and research-based facilities and laboratories. Furthermore, the scale, scope and multi-disciplinary approach that translational research requires mean a new level of operations management capabilities within and across studies, repositories and laboratories. Meeting the increased operational requirements of larger studies, with ever increasing specimen counts, larger and more complex systems biology data sets, and government regulations, calls for an informatics approach that integrates both operational capabilities and clinical and basic data. Most informatics systems in use today are inadequate for handling complicated operations and for managing and analyzing data in context.

Definitions of translational research

Translational research is a new and rapidly evolving domain. As such, the definition is bound to evolve over time. Generally speaking, the primary goal of “translational” research is to integrate advancements in molecular biology with clinical trials, taking research from the “bench-to-bedside”. [2][3]



Top-down and bottom-up design

Top-down and bottom-up are strategies of information processing and knowledge ordering, used mostly in software, but also in other humanistic and scientific theories (see systemics). In practice, they can be seen as styles of thinking and teaching. In many cases top-down is used as a synonym of analysis or decomposition, and bottom-up as a synonym of synthesis.

A top-down approach is essentially the breaking down of a system to gain insight into its compositional sub-systems. In a top-down approach an overview of the system is first formulated, specifying but not detailing any first-level subsystems. Each subsystem is then refined in yet greater detail, sometimes in many additional subsystem levels, until the entire specification is reduced to base elements. A top-down model is often specified with the assistance of "black boxes", which make it easier to manipulate. However, black boxes may fail to elucidate elementary mechanisms, or may not be detailed enough to realistically validate the model.

A bottom-up approach is the piecing together of systems to give rise to grander systems, thus making the original systems sub-systems of the emergent system. In a bottom-up approach the individual base elements of the system are first specified in great detail. These elements are then linked together to form larger subsystems, which then in turn are linked, sometimes in many levels, until a complete top-level system is formed. This strategy often resembles a "seed" model, whereby the beginnings are small but eventually grow in complexity and completeness. However, "organic strategies" may result in a tangle of elements and subsystems, developed in isolation and subject to local optimization as opposed to meeting a global purpose.

The top-down approach is also known as stepwise design.

Computer science

Software development


In the software development process, the top-down and bottom-up approaches play a key role.

Top-down approaches emphasize planning and a complete understanding of the system. It is inherent that no coding can begin until a sufficient level of detail has been reached in the design of at least some part of the system. In practice, a top-down design is implemented by attaching stubs in place of modules that have not yet been written. This, however, delays testing of the ultimate functional units of a system until significant design is complete. Bottom-up emphasizes coding and early testing, which can begin as soon as the first module has been specified. This approach, however, runs the risk that modules may be coded without a clear idea of how they link to other parts of the system, and that such linking may not be as easy as first thought. Re-usability of code is one of the main benefits of the bottom-up approach.

Top-down design was promoted in the 1970s by IBM researcher Harlan Mills and by Niklaus Wirth. Mills developed structured programming concepts for practical use and tested them in a 1969 project to automate the New York Times morgue index. The engineering and management success of this project led to the spread of the top-down approach through IBM and the rest of the computer industry. Niklaus Wirth, the developer of the Pascal programming language, wrote the influential paper Program Development by Stepwise Refinement. Since Wirth went on to develop languages such as Modula and Oberon (in which a module can be defined before the entire program specification is known), one can infer that strict top-down programming was not what he promoted. Top-down methods were favored in software engineering until the late 1980s, when object-oriented programming helped demonstrate that top-down and bottom-up approaches could be used together.

Modern software design approaches usually combine both top-down and bottom-up approaches. Although an understanding of the complete system is usually considered necessary for good design, leading theoretically to a top-down approach, most software projects attempt to make use of existing code to some degree. Pre-existing modules give designs a bottom-up flavour. Some methodologies design and code a partially functional system to completion, and then expand it to fulfill all the requirements for the project.

Programming

Top-down is a programming style, the mainstay of traditional procedural languages, in which design begins by specifying complex pieces and then dividing them into successively smaller pieces. Eventually, the components are specific enough to be coded and the program is written. This is the exact opposite of the bottom-up programming approach which is common in object-oriented languages such as C++ or Java.

The technique for writing a program using top-down methods is to write a main procedure that names all the major functions it will need. Later, the programming team looks at the requirements of each of those functions and the process is repeated. These compartmentalized sub-routines eventually will perform actions so simple they can be easily and concisely coded. When all the various sub-routines have been coded the program is done.

By defining how the application comes together at a high level, lower level work can be self-contained. By defining how the lower level abstractions are expected to integrate into higher level ones, interfaces become clearly defined.
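
The following short Python sketch illustrates the style just described: the main procedure is written first against the major functions it needs, and each of those starts life as a stub returning canned data so the overall flow can be exercised before any real implementation exists. All names and file paths are invented for the example.

    # Top-down sketch: main() is written first against the major functions it
    # needs. Each lower-level function starts as a stub with placeholder
    # behaviour, so the skeleton can be run and tested before it is refined.

    def read_records(path):
        """Stub: will eventually parse the file at `path`; canned data for now."""
        return [{"category": "a", "value": 1}, {"category": "b", "value": 2}]

    def summarize(records):
        """Stub: will eventually compute per-category totals; trivial version."""
        totals = {}
        for rec in records:
            totals[rec["category"]] = totals.get(rec["category"], 0) + rec["value"]
        return totals

    def write_report(summary, path):
        """Stub: will eventually format and save the report; just prints for now."""
        print(f"would write to {path}: {summary}")

    def main():
        # The overall flow is fixed before the lower-level modules are finished.
        records = read_records("input.csv")
        summary = summarize(records)
        write_report(summary, "report.txt")

    if __name__ == "__main__":
        main()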

Top-down approach

Practicing top-down programming has several advantages:

  • Separating the low level work from the higher level abstractions leads to a modular design.
  • Modular design means development can be self contained.
  • Having "skeleton" code illustrates clearly how low level modules integrate.
  • Fewer operational errors, because each module is developed and processed separately, giving programmers adequate time for each.
  • Less time-consuming development overall, since each programmer is involved in only part of the larger project.
  • Well-optimized processing, since each programmer applies his or her own knowledge and experience to his or her own modules.
  • Easier maintenance: if an error occurs in the output, it is easy to identify which module of the program generated it.

Bottom-up approach

A Lego racer modeled as individual parts in Pro/ENGINEER (WF4.0) is a good example of bottom-up design, because the parts are first created and then assembled without regard to how the parts will work in the assembly.


Object-oriented programming (OOP) is a programming paradigm that uses "objects" to design applications and computer programs.

In mechanical engineering, with software programs such as Pro/ENGINEER, SolidWorks, and Autodesk Inventor, users can design products as individual pieces rather than as part of the whole, and later add those pieces together to form assemblies, much like building with LEGO. Engineers call this piece-part design.

This bottom-up approach has one weakness: considerable intuition is needed to decide what functionality each module should provide. However, if a system is to be built from an existing system, this approach is more suitable, as it starts from existing modules.

Pro/ENGINEER (as well as other commercial Computer Aided Design (CAD) programs) does, however, support top-down design through the use of so-called skeletons. These are generic structures that hold information on the overall layout of the product. Parts can inherit interfaces and parameters from this generic structure. Like parts, skeletons can be put into a hierarchy. Thus, it is possible to build the overall layout of a product before the parts are designed.

Parsing

Parsing is the process of analyzing an input sequence (such as that read from a file or a keyboard) in order to determine its grammatical structure. This method is used in the analysis of both natural languages and computer languages, as in a compiler.

Bottom-up parsing is a strategy for analyzing unknown data relationships that attempts to identify the most fundamental units first, and then to infer higher-order structures from them. Top-down parsers, on the other hand, hypothesize general parse tree structures and then consider whether the known fundamental structures are compatible with the hypothesis. See Top-down parsing and Bottom-up parsing.
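
A minimal top-down parser can be written directly from a grammar, with one function per rule. The Python sketch below uses the usual expression/term/factor grammar for arithmetic, chosen purely for illustration: each function hypothesizes a structure and then consumes the tokens that fit it, which is the defining trait of top-down (recursive-descent) parsing.

    # Minimal top-down (recursive-descent) parser for the grammar
    #   expr   := term (('+' | '-') term)*
    #   term   := factor (('*' | '/') factor)*
    #   factor := NUMBER | '(' expr ')'
    import re

    def tokenize(text):
        return re.findall(r"\d+|[()+\-*/]", text)

    class Parser:
        def __init__(self, tokens):
            self.tokens, self.pos = tokens, 0

        def peek(self):
            return self.tokens[self.pos] if self.pos < len(self.tokens) else None

        def eat(self, expected=None):
            tok = self.peek()
            if tok is None or (expected and tok != expected):
                raise SyntaxError(f"expected {expected!r}, got {tok!r}")
            self.pos += 1
            return tok

        def expr(self):
            node = self.term()
            while self.peek() in ("+", "-"):
                node = (self.eat(), node, self.term())
            return node

        def term(self):
            node = self.factor()
            while self.peek() in ("*", "/"):
                node = (self.eat(), node, self.factor())
            return node

        def factor(self):
            if self.peek() == "(":
                self.eat("(")
                node = self.expr()
                self.eat(")")
                return node
            return int(self.eat())

    if __name__ == "__main__":
        print(Parser(tokenize("2*(3+4)")).expr())   # ('*', 2, ('+', 3, 4))

A bottom-up parser for the same grammar would instead collect tokens and reduce recognized fragments into larger structures, as LR parser generators do.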

Nanotechnology

Top-down and bottom-up are two approaches for the manufacture of products. These terms were first applied to the field of nanotechnology by the Foresight Institute in 1989 in order to distinguish between molecular manufacturing (to mass-produce large atomically precise objects) and conventional manufacturing (which can mass-produce large objects that are not atomically precise). Bottom-up approaches seek to have smaller (usually molecular) components built up into more complex assemblies, while top-down approaches seek to create nanoscale devices by using larger, externally-controlled ones to direct their assembly.

The top-down approach often uses the traditional workshop or microfabrication methods, where externally-controlled tools are used to cut, mill, and shape materials into the desired shape and order. Micropatterning techniques, such as photolithography and inkjet printing, belong to this category.

Bottom-up approaches, in contrast, use the chemical properties of single molecules to cause single-molecule components to (a) self-organize or self-assemble into some useful conformation, or (b) rely on positional assembly. These approaches utilize the concepts of molecular self-assembly and/or molecular recognition. See also Supramolecular chemistry.

Such bottom-up approaches should, broadly speaking, be able to produce devices in parallel and much more cheaply than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases.

Neuroscience and psychology

An example of top-down processing: even though the second letter in each word is ambiguous, top-down processing allows for easy disambiguation based on the context.

These terms are also employed in neuroscience and psychology. The study of visual attention provides an example. If your attention is drawn to a flower in a field, it may be simply that the flower is more visually salient than the surrounding field. The information that caused you to attend to the flower came to you in a bottom-up fashion — your attention was not contingent upon knowledge of the flower; the outside stimulus was sufficient on its own.

Contrast this situation with one in which you are looking for a flower. You have a representation of what you are looking for. When you see the object you are looking for, it is salient. This is an example of the use of top-down information.

In cognitive terms, two thinking approaches are distinguished. "Top down" (or "big chunk") is stereotypically the visionary, or the person who sees the larger picture and overview. Such people focus on the big picture and from that derive the details to support it. "Bottom up" (or "small chunk") cognition is akin to focusing on the detail primarily, rather than the landscape. The expression "seeing the wood for the trees" references the two styles of cognition.

Management and organization

In management and organizational arenas, the terms "top down" and "bottom up" are used to indicate how decisions are made.

A "top down" approach is one where an executive, decision maker, or other person or body makes a decision. This approach is disseminated under their authority to lower levels in the hierarchy, who are, to a greater or lesser extent, bound by them. For example, a structure in which decisions either are approved by a manager, or approved by his or her authorized representatives based on the manager's prior guidelines, is top-down management.

A "bottom up" approach is one that works from the grassroots — from a large number of people working together, causing a decision to arise from their joint involvement. A decision by a number of activists, students, or victims of some incident to take action is a "bottom-up" decision.

Positive aspects of top-down approaches include their efficiency and superb overview of higher levels. Also, external effects can be internalized. On the negative side, if reforms are perceived to be imposed ‘from above’, it can be difficult for lower levels to accept them (e.g. Bresser Pereira, Maravall, and Przeworski 1993). Evidence suggests this to be true regardless of the content of reforms (e.g. Dubois 2002). A bottom-up approach allows for more experimentation and a better feeling for what is needed at the bottom.

State organization

Both approaches can be found in the organization of states, where political decisions are involved.

In bottom-up organizations, e.g. ministries and their subordinate entities, decisions are prepared by experts in their fields, who define, from their expertise, the policy they deem necessary. If they cannot agree, even on a compromise, the problem is escalated to the next higher level of the hierarchy, where a decision is sought. Ultimately, the highest common principal may have to take the decision. Information flows upward: the inferior owes information to the superior. In effect, once the inferiors agree, the head of the organization merely lends his or her "face" to the decision on which they have agreed.

Among countries, the German political system provides one of the purest forms of the bottom-up approach. The German Federal Act on the Public Service provides that any inferior must advise and support his or her superiors, that he or she need only follow the "general guidelines" of the superiors, that he or she is fully responsible for his or her own acts in office, and that he or she must follow a specific, formal complaint procedure if in doubt about the legality of an order.[1] German politicians have frequently had to leave office on the allegation that they made wrong decisions because of their resistance to the opinions of subordinate experts (commonly called being "beratungsresistent", or resistant to consultation, in German). The historical foundation of this approach is that, in the 19th century, many politicians were noblemen without appropriate education, who increasingly became forced to rely on the advice of educated experts, who (in particular after the Prussian reforms of Stein and Hardenberg) enjoyed the status of financially and personally independent, non-dismissable, and neutral experts as Beamte (public servants under public law).

A similar approach can be found in British police law, where the entitlements of police constables are vested in the constable personally and not in the police as an administrative agency, so that each constable is fully responsible for his or her own acts in office, in particular their legality. The experience of two dictatorships and, after the end of those regimes, emerging calls for the legal responsibility of the "aides of the aides" (Helfershelfer) of such regimes also furnished calls for the principle that every expert is personally responsible for any decision made, leading to a strengthening of the bottom-up approach, which requires maximum responsibility of the superiors.

By contrast, the French administration is based on a top-down approach, in which regular public servants have no task other than executing the decisions made by their superiors. As those superiors also require consultation, it is provided by the members of a cabinet, which is distinct from the regular ministry staff in its staffing and organization. Staff who are not members of the cabinet are not entitled to make suggestions or to take decisions of a political dimension.

The advantage of the bottom-up approach is the high level of expertise it provides, combined with the motivation that comes from each member of the administration being personally responsible, which in turn acts as an independent "engine" of progress in his or her field of responsibility. A disadvantage is the lack of democratic control and transparency, which, from a democratic viewpoint, defers actual policy-making power to faceless, even unknown, public servants. The fact that certain politicians may "provide their face" for the actual decisions of their inferiors does little to mitigate this effect; rather, it is mitigated by strong parliamentary rights of control and influence over legislative procedures (as exist, for example, in Germany).

The advantage of the top-down principle is that political and administrative responsibilities are clearly distinguished from each other, and that responsibility for political failures can be clearly assigned to the relevant office holder. The disadvantages are that the system demotivates inferiors, who know that their ideas for innovative approaches may be unwelcome simply because of their position, and that decision-makers cannot draw on the full range of expertise that their inferiors will have accumulated.

Administrations in dictatorships traditionally work according to a strict top-down approach. As civil servants below the level of the political leadership are discouraged from making suggestions, such systems tend to suffer from the lack of expertise that the inferiors could provide, which regularly leads to a breakdown of the system after a few decades. Modern communist states, of which the People's Republic of China is an example, therefore prefer to define a framework of permissible, or even encouraged, criticism and self-determination by inferiors that does not affect the major state doctrine, but allows professional, expertise-driven knowledge to be used by the decision-makers in office.

Public health

Both top-down and bottom-up approaches exist in public health. There are many examples of top-down programs, often run by governments or large inter-governmental organizations (IGOs); many of these are disease-specific or issue-specific, such as HIV control or smallpox eradication. Examples of bottom-up programs include many small NGOs set up to improve local access to healthcare. However, many programs seek to combine both approaches; for instance, guinea worm eradication, a single-disease international program currently run by the Carter Center, has involved the training of many local volunteers, boosting bottom-up capacity, as have international programs for hygiene, sanitation, and access to primary health care.

Architectural

Often, the École des Beaux-Arts school of design is said to have primarily promoted top-down design because it taught that an architectural design should begin with a parti, a basic plan drawing of the overall project.

By contrast, the Bauhaus focused on bottom-up design. This method manifested itself in the study of translating small-scale organizational systems to a larger, more architectural scale (as with the woodpanel carving and furniture design).

Ecological

In ecology, top-down control refers to situations in which a top predator controls the structure or population dynamics of an ecosystem. The classic example is the kelp forest ecosystem, in which sea otters are a keystone predator. They prey on urchins, which in turn eat kelp. When otters are removed, urchin populations grow and reduce the kelp forest, creating urchin barrens. In other words, such ecosystems are controlled not by the productivity of the kelp but by a top predator.

Bottom-up control refers to ecosystems in which the nutrient supply, productivity, and type of primary producers (plants and phytoplankton) control the ecosystem structure. An example would be how plankton populations are controlled by the availability of nutrients. Plankton populations tend to be higher and more complex in areas where upwelling brings nutrients to the surface.

There are many different examples of these concepts. It is not uncommon for populations to be influenced by both types of control.

Nanotechnology in water treatment

Nanotechnology, the engineering and art of manipulating matter at the nanoscale (1-100 nm), offers the potential of novel nanomaterials for the treatment of surface water, groundwater and wastewater contaminated by toxic metal ions, organic and inorganic solutes and microorganisms. Due to their unique activity toward recalcitrant contaminants many nanomaterials are under active research and development for use in the treatment of water.[1]

Detection of microbial pathogens

An adequate supply of safe drinking water is one of the major prerequisites for a healthy life, but waterborne diseases are still a major cause of death in many parts of the world, particularly among young children, the elderly, and those with compromised immune systems. As the epidemiology of waterborne diseases is changing, there is a growing global public health concern about new and reemerging infectious diseases that are occurring through a complex interaction of social, economic, evolutionary, and ecological factors. An important challenge is therefore the rapid, specific and sensitive detection of waterborne pathogens. Presently, microbial tests are based essentially on time-consuming culture methods. However, newer enzymatic, immunological and genetic methods are being developed to replace and/or support classical approaches to microbial detection. Moreover, innovations in nanotechnology and nanosciences are having a significant impact in biodiagnostics, where a number of nanoparticle-based assays and nanodevices have been introduced for biomolecular detection.[1][2]

Nanofibers and nanobiocides

Electrospun nanofibers and nanobiocides show potential in the improvement of water filtration membranes. Biofouling of membranes caused by the bacterial load in water reduces the quality of drinking water and has become a major problem. Several studies showed inhibition of these bacteria after exposure to nanofibers with functionalized surfaces. Nanobiocides such as metal nanoparticles and engineered nanomaterials are successfully incorporated into nanofibers showing high antimicrobial activity and stability in water.

Biofilm removal

Sessile communities of bacteria encased in extracellular polymeric substances (EPS) are known as biofilms and cause serious problems in various areas, among others the medical industry, industrial water systems, the paper industry and the food processing industry.[3] Although various methods of biofilm control exist, these methods are not without limitations and often fail to remove biofilms from surfaces. Biofilms often show reduced susceptibility to antimicrobials or chemicals, chemical by-products may be toxic to the environment, and mechanical methods may be labour intensive and expensive due to the down-time required to clean the system. This has led to a great interest in the enzymatic degradation of biofilms.

Enzymes are highly selective and disrupt the structural stability of the biofilm EPS matrix. Various studies have focused on the enzymatic degradation of polysaccharides and proteins for biofilm detachment, since these are the two dominant components of the EPS. Due to the structural role of proteins and polysaccharides in the EPS matrix, a combination of various proteases and polysaccharases may be successful in biofilm removal. The biodegradability and low toxicity of enzymes also make them attractive biofilm control agents.

Despite all the advantages associated with enzymes, they also suffer from various drawbacks: they are relatively expensive, show insufficient stability or activity under certain conditions, and cannot be reused. Various approaches are being used to increase the stability of enzymes, including enzyme modification, enzyme immobilization, protein engineering and medium engineering. Although these conventional methods have been used frequently to improve the stability of enzymes, various new techniques, such as the self-immobilization of enzymes, the immobilization of enzymes on nano-scale structures and the production of single-enzyme nanoparticles, have been developed. Self-immobilization of enzymes entails the cross-linking of enzyme molecules with each other and yields final preparations consisting of essentially pure proteins and high concentrations of enzyme per unit volume. The activity, stability and efficiency of immobilized enzymes can be improved by reducing the size of the enzyme carrier. Nano-scale carrier materials allow for high enzyme loading per unit mass, catalytic recycling and a reduced loss of enzyme activity. Furthermore, enzymes can be stabilized by producing single-enzyme nanoparticles consisting of single enzyme molecules surrounded by a porous organic-inorganic network less than a few nanometers thick. All these new enzyme-stabilization technologies make enzymes even more attractive alternatives to other biofilm removal and control agents.[1]

Nanofiltration

Nanofiltration is a relatively new pressure-driven membrane process whose separation characteristics lie between those of reverse osmosis and ultrafiltration membranes. The most distinctive property of nanofiltration membranes is their higher rejection of multivalent ions compared with monovalent ions. Nanofiltration membranes are used in water softening, brackish water treatment, industrial wastewater treatment and reuse, product separation in industry, salt recovery and, recently, desalination in two-pass nanofiltration systems.

Reverse Osmosis

The membrane separation technologies of reverse osmosis (hyperfiltration) and nanofiltration are important in water treatment applications. Reverse osmosis is based on the basic principle of osmotic pressure, while nanofiltration makes use of molecule size for separation. Recent advances in the field of nanotechnology are opening a range of possibilities in membrane technologies. These include: new membrane preparation and cleaning methods, new surface and interior modification possibilities, the use of new nanostructured materials, and new characterization techniques.
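
Reverse osmosis must apply a pressure exceeding the solution's osmotic pressure. As a rough illustration of the magnitudes involved, the sketch below evaluates the van't Hoff relation π = iMRT for seawater-like and brackish-water-like salinities; the concentrations and van't Hoff factor are approximate values chosen for the example.

    # Rough illustration of the osmotic pressure a reverse-osmosis membrane must
    # overcome, using the van't Hoff relation  pi = i * M * R * T.
    # The salinity figures below are approximate and for illustration only.

    R = 8.314          # J/(mol*K), gas constant

    def osmotic_pressure_bar(molarity_mol_per_L, temperature_K, vant_hoff_i):
        """van't Hoff estimate of osmotic pressure, returned in bar."""
        molarity_si = molarity_mol_per_L * 1000.0      # mol/m^3
        pressure_pa = vant_hoff_i * molarity_si * R * temperature_K
        return pressure_pa / 1e5

    if __name__ == "__main__":
        # ~35 g/L NaCl -> roughly 0.6 mol/L, dissociating into two ions (i ~ 2)
        print(f"seawater-like: ~{osmotic_pressure_bar(0.6, 298.0, 2):.0f} bar")
        # brackish water, roughly 0.05 mol/L
        print(f"brackish-like: ~{osmotic_pressure_bar(0.05, 298.0, 2):.1f} bar")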

Electrospinning

Electrospinning is a highly versatile technique that can be used to create ultrafine fibres of various polymers and other materials, with diameters ranging from a few micrometers down to tens of nanometres. The nonwoven webs of fibers formed through this process typically have high specific surface areas, nano-scale pore sizes, high and controllable porosity and extreme flexibility with regard to the materials used and modification of the surface chemistry of the fibres. A combination of these features is utilized in the application of electrospun nanofibres to a variety of water treatment applications, including filtration, solid phase extraction and reactive membranes.

Potential risks on human health

The risk assessment of nanoparticles and nanomaterials is of key importance for the continuous development in the new field of nanotechnology. Humans are increasingly being exposed to nanoparticles and nanomaterials, placing stress on the development and validation of reproducible toxicity tests. Tests currently used include genotoxicity and cytotoxicity tests, and in vivo toxicity models. The unique characteristics of nanoparticles and nanomaterials are responsible for their toxicity and interaction with biological macromolecules within the human body. This may lead to the development of diseases and clinical disorders. A loss in cell viability and structure can also occur in exposed tissues as well as inflammation and granuloma formation. The future of nanotechnology depends on the responsible assessment of nanoparticles and nanomaterials.[1]

Nanotechnology education

Nanotechnology education is being offered by more and more universities around the world.[1] The first program involving nanotechnology was offered by the University of Toronto's Engineering Science program, where nanotechnology could be taken as an option. Generally, nanotechnology education involves a multidisciplinary natural science education with courses in nanotechnology, physics, chemistry, math and molecular biology.

Here is a list of universities offering nanotechnology education and the degrees offered: Bachelor of Science in Nanotechnology, Master of Science in Nanotechnology, and PhD in Nanotechnology. This list is not complete.

Latin America

Brazil

Mexico

  • Instituto Nacional de Astrofisica, Optica y Electronica (INAOE) - Masters and PhD[6]
  • Universidad de las Américas - Bachelor (Licenciatura en Nanotecnología e Ingeniería Molecular)

Europe

A list of the Masters programs in Europe is kept by the UK based Institute of Nanotechnology in their Nanotechnology Masters Courses Directory.[7]

Joint Programmes

Belgium

Czech Republic

Denmark

France

Germany

Greece

Israel

Italy

Netherlands

Norway

Poland

Spain

  • DFA.ua.es, Master en Nanociencia y Nanotecnologia Molecular - Masters

Sweden

Switzerland

United Kingdom

Turkey

United States

Australia

New South Wales

Queensland

South Australia

Victoria

Western Australia

New Zealand

Canada

Asia

India


Hong Kong

Singapore

Thailand

Malaysia

Japan

Nanotechnology in Schools

In addition to several tertiary programs, nanotechnology is being taught as part of science studies at high schools. The Journal of Nanoeducation provides articles on K-12 initiatives.

in2nano is a high school outreach program in Egypt aiming to increase scientific literacy and prepare students for the sweeping changes of nanotechnology.
Latrobe.edu.au, "in2nanotech", a high school outreach program from La Trobe University where students go on a roadshow to increase awareness of nanotechnology among school students.