
Published: September 16, 2011

Copyright © 2011 American Chemical Society and Division of Chemical Education, Inc. dx.doi.org/10.1021/ed100779v | J. Chem. Educ. 2011, 88, 1511–1514

    ARTICLE

    pubs.acs.org/jchemeduc

Periodic Table of the Elements in the Perspective of Artificial Neural Networks

Maurício R. Lemes* and Arnaldo Dal Pino

Faculdade Anhanguera de Taubaté, Engenharia, Av. Charles Schnneider, 585, Parque Senhor Bonfim, Taubaté, São Paulo 12062-350, Brazil
Instituto Tecnológico de Aeronáutica, Praça Mal.-do-Ar Eduardo Gomes 50, Vila das Acácias, São José dos Campos, SP 12228-900, Brazil

    In 1869 Mendeleev1 presented the periodic law of the elements

to the scientific community. Mendeleev knew the existence and some properties of about 60 elements. For the vast majority of these elements, his knowledge was restricted to atomic weight, reaction of the element with oxygen, atomic radius, and melting point.2 He had so much confidence in his discovery that he left empty positions in his table. These spaces were dedicated to those elements that, according to him, would still have to be discovered. If one takes into consideration the limited information available, the table developed by Mendeleev deserves the greatest admiration.

At that time, scientists knew nothing about the atomic structure and atomic numbers that are used in the organization of the elements of the current table. Over 40 years later, in 1913, Moseley established the concept of atomic number.3 This discovery, however, provoked only minor rearrangements in the classification of the elements created by Mendeleev. Possibly, the biggest triumph of the periodic table of the elements was to foresee the existence and properties of elements unknown at its time. For example, Mendeleev not only claimed the existence of the element eka-silicon, today known as germanium, but also inferred its properties and reactions with chlorine and oxygen with considerable precision.

The periodic table identifies similarities between two or more elements and arranges them under the format of periods and groups. The intervals at which these similarities repeat are consistently related to the atomic number. In the table, the elements are arranged horizontally, in numerical sequence, according to their atomic numbers, thus giving rise to the appearance of seven horizontal lines (or periods). Each period, with the exception of the first one, starts with a metal and finishes with a noble gas. The length of a period differs, ranging from a minimum of 2 elements to a maximum of 32. The vertical lines are formed by elements whose external electronic structures are similar. These columns are called groups. In some of them, the elements are so closely related that they are called families. For example, group 2 is the family of alkaline earth metals (beryllium, magnesium, calcium, strontium, barium, and radium).

Such great success of human intelligence yields a fertile field for exploring the capacity of artificial intelligent systems to produce similar results. Kohonen networks,4 self-organized maps, and other techniques have been commonly used in classification efforts, such as in silicon clusters, spectrometry, modeling, optimization, chemical problems,5−11 and others.12−15

The goal of this article is to investigate the capacity of an artificial intelligent system to classify chemical elements. To this end, a Kohonen network (KN) is supplied with the information known by the end of the 19th century. The KN is, therefore, fed with knowledge similar to that available to Mendeleev. We show that the 8 × 8 KN places the elements in such a way that it obeys many properties presented in the original periodic table. Such a fact reinforces the efficiency of the method. We also show that some elements are so similar that they share the same cell.

ABSTRACT: Although several chemical elements were not known by the end of the 19th century, Mendeleev came up with an astonishing achievement, the periodic table of the elements. He was not only able to predict the existence of (then) new elements, but also to provide accurate estimates of their chemical and physical properties. This is a profound example of human intelligence. Here, we try to shed some light on the following question: Can an artificial intelligence system yield a classification of the elements that resembles, in some sense, the periodic table? To achieve our goal, we have used a self-organized map (SOM) with information available at Mendeleev's time. Our results show that similar elements tend to form individual clusters. Thus, although the SOM generates clusters of halogens, alkali metals, and transition metals that show a similarity with the periodic table of the elements, the SOM did not achieve the sophistication that Mendeleev achieved.

KEYWORDS: General Public, Graduate Education/Research, Interdisciplinary/Multidisciplinary, Physical Chemistry, Computer-Based Learning, Atomic Properties/Structure, Chemometrics, Periodicity/Periodic Table, Physical Properties

KOHONEN NEURAL NETWORK

Neural networks were originally developed16 in the 1940s by the neurophysiologist Warren McCulloch of the Massachusetts Institute of Technology and the mathematician Walter Pitts of the University of Illinois. They proposed a simple model of the neuron that revealed itself as a powerful computing device and proved that a synchronized arrangement of these neurons is capable, in principle, of universal computation. Thus, an artificial neural network, although modeled on the human brain, can in principle perform any calculation that an ordinary computer can.

An artificial neural network (ANN) is composed of several processing units whose individual functioning is simple. These units are connected by communication channels that are associated with certain weights. The units operate on their local data, which are the entries received by their connections. The intelligent behavior of an ANN is a global effect explained by interactions between the processing units of the network. There are two types of ANN according to the learning scheme: supervised and unsupervised. In this work, a well-known type of unsupervised learning network called the Kohonen network (KN)17 is used.

The KNs are formed by a set of simple elements organized in more complex structures that work together. Each neuron is a processing unit that receives stimuli (from outside the system or from other neurons) and produces a response (to other neurons or outside the system). Similar to the structure of the brain, the neurons of the neural networks are interconnected by branches through which the stimuli are propagated. The learning process consists of strengthening the links that lead the system to produce more efficient responses. The goal of a KN is to map input patterns of arbitrary dimension N onto a discrete geometric arrangement of two dimensions (Figure 1). What distinguishes the Kohonen networks from others is a double-layered structure: one layer for input and another for processing, where the map is formed. The processing layer consists of a geometric arrangement of neurons connected only to their immediate neighbors.

The objects to be grouped for subsequent segmentation (for example, the chemical elements) are presented, one at a time, to the input neurons. At each presentation, the stimuli generated by the object (for example, atomic weight, atomic radius, density, melting temperature, etc.) are captured by the input layer and transmitted equally to all the neurons of the map layer. In the network, the neuron that responds most strongly to the stimuli of the presented object wins it for itself. Furthermore, it reinforces its links with its neighbors, making them more sensitive to the characteristics of the captured object.
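The winner-take-all step just described can be sketched in a few lines (an illustrative sketch only, not the authors' code; the array shapes and the use of Euclidean distance as the measure of "strength of response" are our assumptions):

```python
import numpy as np

def winner(weights, x):
    """Return the (row, col) of the neuron that responds most strongly to
    the stimulus vector x, taken here as the neuron whose weight vector
    is closest to x in Euclidean distance."""
    dist = np.linalg.norm(weights - x, axis=2)   # shape: (rows, cols)
    return np.unravel_index(np.argmin(dist), dist.shape)
```

For an 8 × 8 map trained on 5 input properties, `weights` would be an array of shape (8, 8, 5), and each chemical element a vector of 5 normalized stimuli.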

The presentation of all input objects to the neural network and the update of the weights for each item is termed an epoch. In a new epoch, when an object is presented to the map, the entire sensitized region will react more intensely. However, as the neighboring neurons are different from the winning neuron, each will react more intensely to a slightly different object. With each new presentation of an object to the map, the sensitivity profile of the neurons changes. This is what is termed training the network (Figure 2). These changes, however, become smaller each time, so that the configuration of the map converges to a stable arrangement. When this happens, the map has learned to classify individuals.

    The result of processing a trained network is that each neuronbecomes the owner of a number of objects (Figure 3) similar tothose captured by neighboring neurons. Thus, similar individualsget placed near each other, forming a gradient of characteristics.The KNs may be referred to as self-organized maps. They are anexample of unsupervised learning networks.

Figure 1. Schematic of a Kohonen network; input data are represented by the black circles, with the solid lines representing possible pathways to the network, and the processing is represented by the dotted lines on the 5 × 4 grid.

Figure 2. Uppercase and lowercase letters, vowels and consonants, make up the group to be classified. The neuron in the KN that responds most strongly to an object wins it (neighboring cells are affected).

Figure 3. The neuron with the strongest response captures the letter. On the left, the first letter (E) has been previously captured and the letter B is being captured. On the right, the group of letters has been organized; note that the uppercase and lowercase letters are near each other and vowels and consonants are near each other.

Table 1. The 69 Elements Used in This Work^a

H Ca* Y* Mg* Ga* Nb* I
Li Zn* In* Ce* N* Sb Fe
Na* Sr* La* Hf P* Ta Pt
K* Cd* Er Pb V* Bi Ni*
Cu* Ba Tl Th As* F* Cu*
Rb* Hg C O* Mo* Cl* Os
Ag* Be Si S* Te* Mn* Pd*
Cs B Ti* Cr* W Br* Ir
Au Al* Zr* Se* U Ru* Ag*
Be Sc* Sn* Tc Co Rh*

^a The asterisk (*) denotes the 41 elements chosen for the training of the KN.


    The learning process uses a set of known elements and theirproperties to determine the optimal values for the connectionsbetween neurons represented by the weights, w. Mathematically,the learning process of a KN may be described by

w_ij(k + 1) = w_ij(k) + Δw_ij(k)

Δw_ij(k) = η e^(−σ d(l,j)) [x_i − w_ij(k)]

where η is called the learning rate, σ is the neighborhood factor (the higher the value of σ, the less the neighborhood will be affected), x_i represents the ith training property, w_ij are the weights to be trained, l and j are indexes that characterize the cells, and d(l,j) is the distance between cells l and j. The weights are initialized from random values and are submitted to training. The process is iterative; that is, the weights obtained in iteration k + 1 are calculated from the values of iteration k, until the values w(k + 1) and w(k) remain essentially unchanged. Each of these iterations is an epoch. It is important to note that the KN possesses periodic boundary conditions. The values of x_i are normalized; that is, upon entering the network, they become values between 0 and 1. This is done to ensure uniformity of the input data, because various properties with different orders of magnitude are used.
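In code, one weight update of this kind might look as follows (a sketch under stated assumptions, not the authors' implementation: NumPy arrays, a square map, and a toroidal cell distance realizing the periodic boundary condition):

```python
import numpy as np

def update(weights, x, win, eta=0.09, sigma=1.4):
    """Apply delta_w_ij(k) = eta * exp(-sigma * d(l, j)) * (x_i - w_ij(k))
    to every cell j, where l = win is the winning cell and d(l, j) is the
    cell distance under the periodic boundary condition (a torus)."""
    grid = weights.shape[0]
    rows, cols = np.indices((grid, grid))
    # wrap-around (toroidal) row/column offsets from the winning cell
    dr = np.minimum(np.abs(rows - win[0]), grid - np.abs(rows - win[0]))
    dc = np.minimum(np.abs(cols - win[1]), grid - np.abs(cols - win[1]))
    d = np.sqrt(dr ** 2 + dc ** 2)
    weights += eta * np.exp(-sigma * d)[:, :, None] * (x - weights)
    return weights
```

Each call corresponds to one presentation of one object; presenting every object once, with its update, constitutes one epoch.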

    TRAINING AND PREDICTION

For the training of the KN, the following properties, known to Mendeleev, were used: atomic weight, covalent radius, atomic radius, melting point, and reaction with oxygen. After the training, an investigation of the behavior of properties different from those that were trained was conducted. These properties were the boiling point, atomic number, ionization potential, electronegativity, and density. The KN was able to map the features that were not part of the training. A list of the 69 elements studied in this work is given in Table 1. Among these elements, 41 chemical elements were randomly chosen for training and, for the training of the networks, the 5 properties identified above were used.
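Because the training properties span very different orders of magnitude, each is scaled to the interval [0, 1] before entering the network. A minimal sketch of such scaling (min–max normalization over each property column is one plausible reading; the exact scheme behind the normalized values of Table 4 is not spelled out in the text):

```python
import numpy as np

def normalize(table):
    """Scale each column of an (elements x properties) table to [0, 1]
    by min-max scaling. Assumes no column is constant."""
    table = np.asarray(table, dtype=float)
    lo = table.min(axis=0)
    hi = table.max(axis=0)
    return (table - lo) / (hi - lo)
```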

For convenience, a KN of square architecture, whose sides were composed of 8 neurons, was used. Training was conducted in 5000 epochs for all tests. Through the systematic variation of the learning parameter (0.04 ≤ η ≤ 0.2) and of the neighborhood parameter (0.7 ≤ σ ≤ 1.5), it was found that the 8 × 8 network with the highest number of cells filled with a single element was obtained when σ = 1.4 and η = 0.09.
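The parameter scan just described can be reproduced in outline: train one map per (η, σ) pair and keep the pair that maximizes the number of singly occupied cells. The sketch below is illustrative rather than the authors' code; random stand-in data replace the 41 training vectors, which are not reproduced here, and the epoch counts are kept small.

```python
import numpy as np

def train_som(data, grid=8, eta=0.09, sigma=1.4, epochs=100, seed=0):
    """Train a grid x grid Kohonen map with periodic boundary conditions."""
    rng = np.random.default_rng(seed)
    w = rng.random((grid, grid, data.shape[1]))
    rows, cols = np.indices((grid, grid))
    for _ in range(epochs):                  # one epoch = all objects once
        for x in data:
            flat = np.argmin(np.linalg.norm(w - x, axis=2))
            win = np.unravel_index(flat, (grid, grid))
            dr = np.minimum(np.abs(rows - win[0]), grid - np.abs(rows - win[0]))
            dc = np.minimum(np.abs(cols - win[1]), grid - np.abs(cols - win[1]))
            d = np.sqrt(dr ** 2 + dc ** 2)
            w += eta * np.exp(-sigma * d)[:, :, None] * (x - w)
    return w

def singly_occupied(w, data):
    """Count map cells that capture exactly one object."""
    counts = np.zeros(w.shape[:2], dtype=int)
    for x in data:
        flat = np.argmin(np.linalg.norm(w - x, axis=2))
        counts[np.unravel_index(flat, w.shape[:2])] += 1
    return int((counts == 1).sum())

# Scan a few (eta, sigma) pairs and keep the best one
data = np.random.default_rng(1).random((41, 5))  # stand-in for the 41 elements
best = max(
    ((eta, sigma) for eta in (0.04, 0.09, 0.2) for sigma in (0.7, 1.4)),
    key=lambda p: singly_occupied(
        train_som(data, eta=p[0], sigma=p[1], epochs=20), data),
)
```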

    RESULTS

The KN after the training process is presented in Table 2. By inspecting Table 2, it can be seen that the KN recognized and grouped elements with high electronegativity. The elements fluorine, chlorine, bromine, oxygen, and nitrogen occupy neighboring cells. The transition metals were also grouped: silver and palladium; nickel and copper; manganese (Mn), chromium (Cr), vanadium (V), and titanium (Ti). There were groupings of alkali metals such as rubidium (Rb), potassium (K), and sodium (Na). Another group that formed along a line was potassium (K), calcium (Ca), and scandium (Sc). There was also a lineup of strontium (Sr), yttrium (Y), and zirconium (Zr). From the 5A group, phosphorus (P) and nitrogen (N) were grouped.

Using the trained weights, the cells occupied by the elements erbium (Er), platinum (Pt), gold (Au), and hydrogen (H) were identified and added to Table 2. This result is shown in Table 3. Note the proposed positions in the KN, placing Er and Ce together, hydrogen in the same cell as fluorine, and platinum together with gold. Compared to the current periodic table, it is noted that erbium and cerium, which occupy the same cell, are lanthanides. Platinum and gold, which are metals, are close to

    Table 3. Map with Predictions Using the Trained Weights

In (4) La (1) Sr (2) Rb (3) K (3) Na (3) Mg (2)
Sn (4) Ce (5) Er Y (1) Ca (2)
Te (6) Zr (1) Sc (1) Al (4) P (6) N (6)
Ag (1) Pt Au Mo (1) Ti (1) O (6)
Pd (1) Ru (1) V (1) F (6) H
Cr (1) S (6) Cl (6)
Ni (1) Mn (1) Br (6)
Cu (1) Ag (1) Zn (1) Ga (4) As (6) Se (6)

Table 2. Map Found^a

    In (4) La (1) Sr (2) Rb (3) K (3) Na (3) Mg (2)

    Sn (4) Ce (5) Y (1) Ca (2)

    Te (6) Zr (1) Sc (1) Al (4) P (6) N (6)

    Ag (1) Mo (1) Ti (1) O (6)

    Pd (1) Ru (1) V (1) F (6)

    Cr (1) S (6) Cl (6)

    Ni (1) Mn (1) Br (6)

Cu (1) Ag (1) Zn (1) Ga (4) As (6) Se (6)

^a The numbers refer to transition metals (1), alkaline earth metals (2), alkali metals (3), other metals (4), lanthanides (5), and nonmetals (6).

    Table 4. Element Properties, Actual Values, and Normalized Values Used for the Training

Atomic Weight/amu  Covalent Radius/Å  Atomic Radius/Å  Melting Point/K  Specific Heat/(J g−1 °C−1)  Reaction with O2

    Element Actual Normalized Actual Normalized Actual Normalized Actual Normalized Actual Normalized Actual Normalized

    Nb 92.91 0.45 1.34 0.55 2.08 0.59 2740 0.70 0.26 0.11 2.5 0.61

    Mo 95.94 0.46 1.3 0.53 2.01 0.57 2890 0.73 0.25 0.11 3 0.74

    Cd 112.41 0.52 1.48 0.61 1.71 0.47 594.18 0.23 0.23 0.11 1.2 0.23

    In 114.82 0.53 1.44 0.60 2 0.56 429.76 0.19 0.23 0.11 1.5 0.36

Cu 63.546 0.34 1.17 0.48 1.57 0.42 1357.6 0.40 0.38 0.12 0.5 × 10−4 0.1
Ag 107.868 0.51 1.34 0.55 1.75 0.48 1234 0.37 0.235 0.11 0.5 × 10−4 0.1
Rh 102.9 0.49 1.25 0.51 1.83 0.51 2236 0.60 0.242 0.11 4 1

    Pd 106.4 0.50 1.28 0.52 1.79 0.50 1825 0.50 0.24 0.11 4 1


silver, ruthenium, and molybdenum, and hydrogen was predicted next to fluorine, which is also a nonmetal.

The final results, starting from Table 2, show 33 cells occupied by one element and 4 cells occupied by 2 elements. (This table is not shown.) Cadmium (Cd) and indium (In), copper (Cu) and silver (Ag), rhodium (Rh) and palladium (Pd), and niobium (Nb) and molybdenum (Mo) occupy the same cells. The properties used for training the neurons and the elements that shared the same cell are presented in Table 4. The pair niobium and molybdenum presents all the untrained properties with similar values. The pair cadmium and indium has similar atomic weights, covalent radii, and atomic radii, and differs by only 20% in melting point. A similar situation occurs for the pair rhodium and palladium. The pair copper and silver has different atomic weights, but the other untrained properties are similar. The network therefore shows that the atomic weight is not the most important feature for classifying elements.

Some properties not used in training are presented in Table 5. The pair cadmium and indium has similar atomic numbers and electronegativities. The pair copper and silver features different atomic numbers, but the other untrained properties are similar. The pair rhodium and palladium has different ionization potentials, but the other properties are similar. The pair niobium and molybdenum has different densities, but the other untrained properties are similar.

    CONCLUSIONS

Using information known at the time of Mendeleev, an artificial intelligent system was tested to classify chemical elements. The KNs were able to map the chemical elements and to organize them according to various trained as well as untrained properties. The KNs organized alkali metals, transition metals, and even properties that were not present during training, for instance, electronegativity. Using the 8 × 8 architecture, the system was efficient and managed to map many different aspects of the elements. However, some chemical elements occupied the same cell because they had similar general properties.

    AUTHOR INFORMATION

Corresponding Author
*E-mail: [email protected]

    REFERENCES

(1) Mendeleev, D. The Relation between the Properties and Atomic Weights of the Elements. J. Russ. Chem. Soc. 1869, 1, 60–77.

(2) Mendeleev, D. Z. Chem. 1869, 12, 405.

(3) Moseley, H. G. J. The High Frequency Spectra of the Elements. Phil. Mag. 1913, 1024.

(4) Kohonen, T. Self-organized formation of topologically correct feature maps. Biological Cybernetics 1982, 43, 59–69.

(5) Vander Heyden, Y.; Vankeerberghen, P.; Novic, M.; Zupan, J.; Massart, D. L. The application of Kohonen neural networks to diagnose calibration problems in atomic absorption spectrometry. Talanta 2000, 51, 455–466.

(6) Tusar, M.; Zupan, J.; Gasteiger, J. J. Chem. Phys. 1992, 89, 1517.

(7) Favata, F.; Walker, R. Biological Cybernetics 1991, 64, 463.

(8) Lemes, M. R.; Pino, A. D., Jr. Quim. Nova 2002, 25, 539.

(9) Lemes, M. R.; Marim, L. R.; Pino, A. D., Jr. Phys. Rev. A 2002, 66, 23203.

(10) Zupan, J.; Gasteiger, J. Anal. Chim. Acta 1991, 248, 1.

(11) Zupan, J.; Gasteiger, J. Neural Networks for Chemists; VCH: New York, 1993.

(12) Lambert, J. M. Proceedings of the 5th ICNN, 1991.

(13) Mhaskar, H. N.; Hahm, N. Neural Computation 1997, 9, 144.

(14) Suzuki, Y. Self-organizing QRS-wave recognition in ECG using neural networks. IEEE Trans. Neural Networks 1995, 1469–1477.

(15) Haykin, S.; Li, L. 16 kb/s adaptive differential PCM of speech. In Applications of Neural Networks to Telecommunications; Alspector, J., Goodman, R., Brown, T. X., Eds.; Lawrence Erlbaum Associates: Hillsdale, NJ, 1993.

(16) McCulloch, W.; Pitts, W. A Logical Calculus of the Ideas Immanent in Nervous Activity. Bull. Math. Biophys. 1943, 5, 115–133.

(17) Kohonen, T. Self-Organization and Associative Memory, 3rd ed.; Springer-Verlag: Berlin, 1989.

Table 5. Element Properties Not Used in Training

Element  Atomic Number  Ionization Potential/V  Electronegativity  Boiling Point/K  Density/(g/cm3)

    Cd 48 8.993 1.69 1040 8.65

    In 49 5.786 1.78 2346 7.31

    Cu 29 7.726 1.9 2836 8.96

    Ag 47 7.576 1.93 2436 10.5

    Rh 45 7.46 2.28 3970 12.4

    Pd 46 8.34 2.2 3237 12

    Nb 41 6.88 1.6 5017 8.55

    Mo 42 7.099 2.16 4912 10.2