
Monday, January 3, 2022

El futuro de la profesión médica

The future of the medical profession

 

Julián Alberto Uribe Gómez

Professor, Faculty of Economic and Administrative Sciences

 

Juan Guillermo Barrientos Gómez

Scientific Director, Clínica Universitaria Bolivariana

 

Much has been said about the development of medicine in recent years. Traditional medicine is increasingly being complemented by new developments and associated technologies: engineers, programmers, analysts, data scientists and technology managers are creating highly innovative products in many fields associated with medicine. Technology and computing are now more accessible and affordable, there is greater connectivity than ten years ago, and the world is optimistic about improvement and development in this area. Indeed, a survey of 1,156 adolescents in the United States found that one of their main sources of medical information is the internet and associated technologies, as shown in image 1.


There is growing confidence in the use of technology and associated devices to diagnose, prevent and treat diseases. This, together with the trends toward miniaturization, artificial intelligence and lower costs, will be fundamental to the development of a medicine that is more proactive and less reactive, which is why countless industries (to name only a few) have ventured into developing devices and support tools for medicine. Table 1 shows each developed technology together with a description of its application.


Table 1. Developed technologies and their application descriptions.

• Artery-cleaning mini-robots (Microbot Medical): remotely controlled autonomous robot for cleaning arteries and veins. https://microbotmedical.com/virob/
• Renaissance surgical robot (Mazor Robotics): robot for spinal surgery. https://www.mazorrobotics.com/en-us/mazor-core-technology/core-tech
• Da Vinci surgical robot (Intuitive Surgical): surgical robot used in cardiovascular, gynecological and urological operations. https://www.intuitive.com/
• IBM Watson for medicine (IBM): artificial intelligence program for diagnosing diseases in patients. https://www.ibm.com/watson-health
• Wearables and the Fitbit band (Fitbit, Apple): watches, wristbands and rings that measure heart rate and steps and monitor sleep. https://www.fitbit.com/es/home
• Tricorder (in development): portable laboratory for measuring vital signs and diagnosing common diseases. Sponsored via https://tricorder.xprize.org/prizes/tricorder
• Shockables (in development): watches and wristbands with sensors that warn the wearer with mild electric shocks when smoking or engaging in other unhealthy habits, to help break them.
• Trainables (in development): sensors with training functions.
• Insideables (in development): sensors that can be carried inside the body, for example contact lenses developed by Google that help measure glucose levels. https://www.washingtonpost.com/business/technology/googles-smart-contact-lens-what-it-does-and-how-it-works/2014/01/17/96b938ec-7f80-11e3-93c1-0e888170b723_story.html
• GoogLeNet (Google): Google artificial intelligence aimed at interpreting pathology images. https://research.google/pubs/pub43022/
• Virtual assistants Siri, Alexa, Cortana and chatbots (Amazon, Apple, Microsoft, Google): robotic assistants for symptom consultation and diagnosis.
• Telemedicine services (supported by platforms such as Skype or mobile phones): virtual medical consultations that allow patients to talk with certified doctors at any time, anywhere, without waiting. https://www.teladochealth.com/en/


In general, disruption in medicine is happening in five main areas: miniaturization, prosthetics and bionics, augmented reality, wearable technologies and 3D printing (Díaz, 2016). Regarding the last of these, one can imagine a cardiologist using many of these engineering-related tools to prescribe personalized pacemakers and manufacture them individually with 3D printers according to each patient's needs. In addition, as robotic technologies advance, surgeons may in some cases have to program robots, which will require experience in logic, programming and engineering. In conclusion, the profession practiced by physicians, and medicine in general, will change in the coming years toward mostly predictive, proactive, personalized and collaborative activities, and the physician's work will evolve toward accompanying patients and interpreting the results generated by these new emerging technologies.

 

References

 

Díaz, J. (2016). 5 Tecnologías que van a revolucionar el mundo de la medicina.

MarketingCharts. (2015). US teens’ top sources of health information.

Tuesday, December 17, 2019

Aplicación de la minería de datos para la toma de decisiones en procesos de fabricación de productos farmacéuticos

Taken from: https://www.itm.edu.co/wp-content/uploads/la-tekhne/2019/PDF-La-Tekhne-No.-106-Diciembre-de-2019-3_compressed.pdf
Application of data mining for decision making in pharmaceutical product manufacturing processes

Stefany Paola Tirado De Stefano
Julián Alberto Uribe Gómez

Today, given the massive increase in the amount of data that must be collected and analyzed in business environments, data mining methodologies have emerged: techniques that extract knowledge from massive data sources and detect opportunities to optimize decision making. This study seeks to take advantage of the data obtained during the production process of a pharmaceutical company focused on manufacturing physiological sera and intravenous solutions.
The study had 2,724 records available, containing information on product identification, production line, production lot, lot size, type of defect found, number of defects and the stage at which they were detected. The specific objectives of the study are to describe which line has the greatest number of defects, determine whether there is an association between production line and type of defect, identify the stage of the process where most defects are found, and identify the types of defects most likely to occur. The general objective is to design strategies that help improve the detection of anomalies within the production process. To achieve this, two strategies are proposed:
1. Classify the data by their characteristics, generating several groups; each record is admitted only into the group whose required characteristics it matches, and otherwise keeps being reassigned until it is admitted into one of them. Within data mining this technique is called clustering.
2. Describe a relationship between the variables contained in the data, so that a production line can be associated with a type of defect; this type of analysis is known as association rules.
The procedure is as follows:
1. Prepare the data.
2. Perform the cleaning and transformation process. It was determined that the Identification and Batch variables are not relevant to the analysis, so they were removed.
3. Create box plots of the numerical variables, lot size and number of defects, to better show the distribution of the data and reveal outliers, which were subsequently removed (see figure 1).
4. Identify characteristics of the categorical variables. In this step it was determined that line 6 has the greatest number of defects; the most common types of defects are cap particle and poor heat sealing, both belonging to the review stage.
5. To observe the characteristics that lead to the generation of defects, a clustering analysis is carried out first, creating 8 groups (or clusters) with the k-means method, which assigns each point (row) to one of the k groups based on its characteristics and its distance to the cluster center (see table 1).
To perform this grouping, the KNIME software was used, a data mining platform that allows models to be built in a visual environment (see figure 2). The generated clusters are evaluated with the Davies-Bouldin (DB) index, which indicates how compact the clusters are; in this case the DB index is 0.348, meaning that the clusters have good internal cohesion. The Silhouette index, which evaluates both the cohesion and the separability of the clusters, is also used; here it takes a value of 0.706, indicating a good grouping of the records.
The clusters reveal the characteristics of the data they contain. Cluster 0, for example, contains observations with an average lot size of 4,641.7 units and an average of 6.79 defective units, coming from line 3 and showing a deformed-cap defect detected at the review stage.
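As a rough idea of how this clustering step can be reproduced outside KNIME, the following Python sketch applies k-means with 8 clusters and computes the same two validation indices with scikit-learn. The file name and column names are assumptions, since the plant's data set is not published with the article.

```python
# Sketch of the clustering step described above (the original analysis was done in KNIME).
# File name and column names are assumptions; the defect data set is not public.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score, silhouette_score

df = pd.read_csv("defects.csv")                        # hypothetical export of the 2,724 records
df = df.drop(columns=["Identification", "Batch"])      # variables judged not relevant

numeric = ["LotSize", "DefectCount"]
categorical = ["Line", "DefectType", "Stage"]

# Scale numeric variables and one-hot encode categorical ones so k-means can use both.
prep = ColumnTransformer([
    ("num", StandardScaler(), numeric),
    ("cat", OneHotEncoder(), categorical),
])
X = prep.fit_transform(df)

# 8 clusters, as in the study.
km = KMeans(n_clusters=8, n_init=10, random_state=42)
labels = km.fit_predict(X)

# Internal validation indices reported in the article (DB = 0.348, Silhouette = 0.706
# on the original data; values here depend entirely on the data actually used).
X_dense = X.toarray() if hasattr(X, "toarray") else X
print("Davies-Bouldin:", davies_bouldin_score(X_dense, labels))
print("Silhouette:", silhouette_score(X_dense, labels))

# Profile each cluster, e.g. mean lot size and defect count per cluster.
print(df.assign(cluster=labels).groupby("cluster")[numeric].mean())
```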
To evaluate the association rules, the Apriori algorithm is used, which reduces the number of candidate itemsets for association. This technique was implemented in the Python programming language and produced a model with 23 rules involving lines 3, 5 and 6. One of the resulting association rules states that when production runs on line 6 and starts with a very large lot size, there is a 79% chance of detecting a heat-sealing defect.
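A sketch of how such rules can be mined in Python is shown below, using the mlxtend implementation of Apriori (the article does not state which library was used); the column names and thresholds are assumptions.

```python
# Sketch of the association-rule step. Column names, the lot-size category and the
# support/confidence thresholds below are assumptions made for illustration.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

df = pd.read_csv("defects.csv")    # hypothetical export of the same records

# Apriori works on one-hot encoded "transactions": here, each record is a transaction
# whose items are its production line, lot-size category and defect type.
items = pd.get_dummies(df[["Line", "LotSizeCategory", "DefectType"]].astype(str))

frequent = apriori(items, min_support=0.05, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.5)

# e.g. look for rules of the form {line 6, very large lot} -> {heat-sealing defect}
print(rules.sort_values("confidence", ascending=False)
           .loc[:, ["antecedents", "consequents", "support", "confidence"]]
           .head(10))
```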
In conclusion, data mining makes it possible to describe or explain facts from a data set, in this case for a pharmaceutical company whose main failures arise in machine operation; it is therefore important that the company constantly monitor the machines and perform preventive maintenance. It is also recommended to adjust the pressure exerted by the machine that forms and attaches the cap to the bag containing the solution, since cluster 0 showed that the deformed-cap defect was very common in the production of line 3, so it is important to evaluate the current state of that line's machines.








Modelos de halving para bitcoin: un acercamiento desde la simulación


Halving models for bitcoin: an approach from simulation

Julián Alberto Uribe Gómez

Bitcoin has been the subject of debate, multiple opinions and controversies since its creation as a cryptocurrency 10 years ago, not only because of its attractiveness as a digital asset with high returns but also because of its uses, which include, among others, illicit activities.
Bitcoin is the first and main cryptocurrency, created by a person or group known as Satoshi Nakamoto, whose aim was to develop a way to make online peer-to-peer (P2P) payments without intermediaries such as financial institutions (Nakamoto, 2008), and also to demonstrate the creation of value without relying on central organizations.
Bitcoins, like any financial asset, must be generated; as with gold, they must be extracted and then traded through supply and demand in a specific market to acquire value. In the crypto-asset market this process of generating supply is known as mining, the only way new cryptocurrencies can be created: increasingly complex mathematical problems are solved, with thousands of mining nodes around the world competing to obtain the new bitcoins.
The bitcoin protocol created by Satoshi Nakamoto fixed total production, or supply, at 21 million coins, with the sole purpose of avoiding inflation of the asset. Much of bitcoin's appeal lies in the promise of becoming a scarce product, with a value that climbed to around 20 thousand dollars per bitcoin at the end of 2017, so pursuing a deflationary model generates its value gradually. The intensive mining of the asset and the production protocol imposed on it have produced a reduction process: as shown in Figure 1, a little more than 17 million bitcoins have already been produced and mined, approximately 83% of the total.
This process of reducing bitcoin issuance is known as halving: an automated process that cuts in half the bitcoins delivered to miners as a reward for creating new blocks. Every 210,000 blocks mined, which is roughly every 4 years, the number of bitcoins delivered per block is reduced: the process began in 2009 delivering 50 bitcoins per block, then 25, and currently 12.5 bitcoins are mined per block, until the reward per block reaches zero, which is estimated to happen around the year 2140.
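The schedule described above can be reproduced with a few lines of Python; this sketch simply applies the halving rule and shows the cumulative supply approaching the 21 million cap.

```python
# Minimal sketch of the bitcoin issuance schedule described above: the block reward
# starts at 50 BTC and is halved every 210,000 blocks, so the cumulative supply
# approaches (and never exceeds) 21 million coins.
reward = 50.0
blocks_per_halving = 210_000
total = 0.0

for halving in range(34):          # after roughly 33 halvings the reward is effectively zero
    total += reward * blocks_per_halving
    print(f"halving {halving:2d}: reward {reward:12.8f} BTC, cumulative {total:,.0f} BTC")
    reward /= 2

# The cumulative total printed above tends to 21,000,000 BTC, the hard cap in the protocol.
```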
Using the system dynamics modeler of the NETLOGO platform, some approximations of the behavior of the halving (reduction) process for bitcoin can be explored. Illustration 2 presents an approach calculated largely according to the ideal behavior of the bitcoin reduction system, where, as mentioned, the reward is cut in half every 4 years.
The behavior of these models is known as exponential decay and goal seeking; both behaviors can be observed in illustration 3. Theoretically, the model shows that at halving 34 the reward will be 0 bitcoins, at which point the supply of the cryptocurrency will have stopped growing.
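As an illustration of the goal-seeking structure mentioned here, outside NETLOGO, the following Python sketch integrates a single stock ("bitcoins mined") whose inflow shrinks as the gap to the 21 million cap closes; the mining rate used is an assumption, not a parameter taken from the article's model.

```python
# Rough sketch of a goal-seeking stock-and-flow structure, integrated with a simple
# Euler step in Python instead of NETLOGO's System Dynamics Modeler.
# Parameter values are illustrative assumptions.
import matplotlib.pyplot as plt

cap = 21_000_000.0        # total supply the system tends toward (the "goal")
mined = 0.0               # stock: bitcoins already mined
rate = 0.17               # fraction of the remaining gap mined per year (assumed)
dt = 0.25                 # years per simulation step

years, history = [], []
t = 0.0
while t < 60:
    flow = rate * (cap - mined)    # mining flow shrinks as the gap to the cap closes
    mined += flow * dt
    years.append(t)
    history.append(mined)
    t += dt

plt.plot(years, history)
plt.xlabel("years")
plt.ylabel("bitcoins mined")
plt.title("Goal-seeking approach to the 21M cap (illustrative)")
plt.show()
```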
However, this reduction process also depends on additional factors and is part of a broader process within the cryptocurrency system: mining bitcoins requires suitable locations, mining nodes and even equipment with specific characteristics to create the coins and obtain the reward. A halving process that includes these broader features can be proposed, as seen in illustration 4.
The result of this model, shown in Figure 5, resembles the behavior observed in Figure 1: growth in the number of bitcoins mined begins smoothly, with a less steep slope, and the total is reached gradually.
Finally, bitcoin, and in general all the cryptocurrencies developed to date, have set a precedent and changed the way the economy is viewed through decentralized proposals. Cryptocurrencies operate and will continue to operate because their innovative, technical and technological development has been gaining strength, creating a market and opening up the exploration of new possibilities.


El diseño experimental aplicado en procesos de enrollamiento filamentario para industrias de materiales compuestos

Taken from: https://www.itm.edu.co/wp-content/uploads/la-tekhne/2019/PDF-La-Tekhne-No.-104-Agosto-de-2019_compressed.pdf

The experimental design applied in filamentary winding processes for composite materials industries

Julián Alberto Uribe Gómez

Composite materials are combinations of two or more materials that together provide better performance and functionality than each one separately. Among them are polymer composites, based on polymers that offer advantages over conventional materials such as lightness, corrosion resistance and flexibility in manufacturing processes, among others.
These polymers can be combined with fibers to improve their properties and become composite materials, which cover a spectrum of applications ranging from structural and architectural elements for construction to the aerospace industry, the automotive industry, shipbuilding and wind power generators [1].
The filament winding process is one of the best known manufacturing methods using glass-fiber-reinforced polymer composite materials. In this process, continuous reinforcements are impregnated with polymeric resin at high speed and deposited precisely on a mandrel or mold that rotates around its axis [1]. The process is highly automated and controlled and less labor-intensive than other molding processes, and it produces laminates with high strength-to-weight ratios that can be used for parts with high mechanical demands [2]. The structures that can be manufactured are bodies of revolution, with cylindrical, spherical, conical or geodesic symmetry [2], such as pipes, tanks and posts. The method has two manufacturing modes, continuous and discontinuous, chosen according to the process capacity of the company and the purpose of the parts to be manufactured; the difference lies in the possible winding forms.
Figure 1 shows the basic scheme of the discontinuous (batch) filament winding process, and Figure 2 shows the continuous process.
Approach to experimental design and data collection
The experimental design approach starts with identifying the variables that influence gel time, the time it takes for the resin to go from a liquid to a solid state while the piece takes shape, which in practice is the working time available to form the product. Table 1 shows the factors and factor levels that directly influence the performance and functionality of a piece manufactured by filament winding.
Therefore, the main objectives required in the composite materials industry are: to evaluate the effects of the factors on the response variable and to propose a mathematical model to predict the behavior of the polymeric resin in the manufacturing process.
Development of the experimental design
The experiment was proposed as a 3^k Box-Behnken design with 4 factors at 3 levels each. This design implies 81 possible combinations of experiments; however, due to experimental limitations such as time, available resources and the feasibility of some combinations, the design was reduced to 16 experiments. From these, the effects on the response variable were calculated, as shown in Table 2.
Table 3 presents the analysis of variance (ANOVA), which shows which factors have a statistically significant effect on gel time; in this case, 5 terms have P-values less than 0.05, and these are the main individual design factors.
Finally, the experimental design must serve as an input for decision making in industry and manufacturing, so it is important to predict gel-time values according to the needs of the different stakeholders. Two scenarios are considered (see table 4), in which the optimal settings are requested for a gel time of 20 minutes and for one of 40 minutes. The R^2 = 93.747% obtained for the experimental design indicates how well the model fits and how reliable it is.
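A sketch of how this analysis (effects, ANOVA table and gel-time predictions) could be run in Python with statsmodels is shown below; since the article does not reproduce the 16 experimental runs, the file name and all factor names other than the inhibitor are assumptions.

```python
# Sketch of the analysis described above: fit a model for gel time, inspect the ANOVA
# table and predict gel time for candidate settings. The CSV file and the factor names
# (other than "inhibitor", mentioned in the conclusions) are assumptions.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

runs = pd.read_csv("gel_time_runs.csv")   # 16 experimental runs with coded factor levels

# Main-effects model for the gel-time response; interaction or quadratic terms can be
# added (e.g. "inhibitor:catalyst") if the design supports them.
model = smf.ols("gel_time ~ inhibitor + catalyst + accelerant + temperature",
                data=runs).fit()

print(sm.stats.anova_lm(model, typ=2))    # ANOVA table: factors with p < 0.05 are significant
print("R-squared:", model.rsquared)       # the article reports R^2 = 93.747%

# Predict gel time for candidate settings, e.g. for the 20- and 40-minute targets (table 4).
candidates = pd.DataFrame({
    "inhibitor":   [0.5, 1.0],
    "catalyst":    [1.0, 0.5],
    "accelerant":  [0.5, 0.0],
    "temperature": [0.0, -1.0],
})
print(model.predict(candidates))
```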
Conclusions
According to the results obtained, the factor with the greatest positive effect on the gel time of the polymeric resin is the amount of inhibitor used in manufacturing; conversely, the other factors have negative effects on the response variable.
It is also important to take the interactions into account: the combinations of factor levels are assigned at random, but the best, or the most problematic, levels of each factor can coincide in combinations that would not be carried out in practice.

La innovación en Antioquia estudiada mediante la simulación basada en agentes


Innovation in Antioquia studied through agent-based simulation

Julián Alberto Uribe Gómez

When talking about innovation in a region, as in the case of the department of Antioquia, the discussion should undoubtedly be framed in terms of a more widely accepted and widespread concept: Regional Innovation Systems (RIS). An RIS can initially be defined as the infrastructure that supports innovation in the productive structure of the region, formed mainly by a network of relationships between the different public and private entities or agents that interact in Antioquia with the objective of drawing on their different capacities to promote innovation.
The general structure of an RIS is presented in Figure 1, which shows the bidirectional relationships among the 4 main types of entities that compose it:
• Explorers: Universities and research groups.
• Exploiters: SMEs and large companies.
• Catalysts: Technology support and development centers, transfer facilitators.
• Government: Policy makers.
The theoretical development of RIS has been influenced by different schools of thought, such as evolutionary economics, institutional economics, new regional economics, the learning economy, the economics of innovation and network theory (Quintero & Robledo, 2013).
Historically, the RIS of Antioquia has been developing for more than two decades, following local initiatives that took the key agents of the process as the basis of its construction. Already in the eighties, Antioquia had great strengths and a certain science and technology structure in the academic, productive and public sectors, and in those years the challenge was raised of developing a science, technology and innovation (CTI) policy that would revolve around the interaction between the agents.
In the 1990s, with the change in Colombia's political constitution, the regions were granted certain powers and functions to make decisions autonomously and to promote the development of capacities, institutions and a basic infrastructure for a science and innovation system.
In the last decade, the university-industry-state committee was created and linked to the regional competitiveness councils and the departmental CTI council. This has allowed Antioquia to achieve significant development in innovation among the different agents of the RIS (Llisterri & Pietrobelli, 2011).
From all of the above, the continuous and relational behavior of the RIS must be understood by conceptualizing the system as a complex network. This implies using computational tools to simulate its different innovation dynamics; for this purpose, the NETLOGO platform has been used to study these structures and various scenarios.
The model (see figure 2) has two input variables, the percentage of R&D and the number of agents (companies, universities, transfer centers and policy makers) in the system, and as output variables it has the number of scientific publications and patents generated over a period of time, used as indicators of innovation activity within the region.
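To make the idea concrete, the following toy Python sketch (not the article's NETLOGO model) shows how the two inputs can drive the two output indicators; every rule and rate in it is an assumption made only for illustration.

```python
# Toy sketch of an innovation-system model: inputs are the R&D percentage and the
# number of agents; outputs are publications and patents. All rules are assumptions.
import random

def simulate(rd_percentage, n_explorers, n_exploiters, steps=100, seed=1):
    random.seed(seed)
    publications = patents = 0
    for _ in range(steps):
        # each explorer (university/research group) may publish, more often with higher R&D
        for _ in range(n_explorers):
            if random.random() < 0.05 + 0.4 * rd_percentage:
                publications += 1
        # each explorer-exploiter contact may yield a patent, also boosted by R&D
        for _ in range(min(n_explorers, n_exploiters)):
            if random.random() < 0.01 + 0.2 * rd_percentage:
                patents += 1
    return publications, patents

print(simulate(rd_percentage=0.02, n_explorers=20, n_exploiters=50))
print(simulate(rd_percentage=0.10, n_explorers=20, n_exploiters=50))
```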
According to the simulations carried out with the model (see figure 3), the greatest generation of innovation indicators occurs in the Medellín area, since most of the agents in the system converge there. Other agents, such as SMEs, lie to a lesser extent outside the Medellín area, and some of them do not participate as active actors in the innovation process.
In addition, increases in the indicators are directly proportional to larger clusters and more interrelationships among agents, to a greater number of explorers participating in the system and, above all, to higher percentages of R&D in the region.

Aplicabilidad del diseño experimental en la agroindustria


Métodos de búsqueda de conocimiento científico


Methods of scientific knowledge search

Julián Alberto Uribe Gómez

Scientific research plays a preponderant role in the sciences, since it is what allows theorizing and obtaining exploratory results through the systematic and experimental study of phenomena by applying the scientific method.
The scientific method has therefore been both a methodology and an indispensable research tool, used in multiple fields of knowledge and in the improvement of reality [1] by diverse figures throughout history.
The history of science and technology has witnessed many scientific milestones related to various types of knowledge-search methods, since the path to a result does not always follow the same route; what is certain, however, is that the explicit or tacit basis will always be the scientific method and its stages: problem statement, search, collection and analysis of information, hypothesis and verification [2].
On this basis, several methods of knowledge search are classified, and celebrated historical figures from different branches of science are presented, with their scientific objective, as examples of each method.
The examples, and each of the methods listed above, show how a problem can be approached from different perspectives. Science and its history are versatile and full of debate; their continuous evolution has allowed us to question developments and postulate better-adjusted principles, which has generated transformative, active knowledge [3].
Ideas such as mechanism and determinism in past epochs allowed solid scientific foundations to be laid for putting the sciences into operation; however, challenging these assumptions with other theories, such as the principle of causality, the principle of sufficient reason, the principle of indeterminacy, chance and unpredictability, among others, has generated substantial debates for the progress and good of science and technology.



La adopción tecnológica: entre dos paradigmas de simulación


Technological adoption: between two simulation paradigms

Julián Alberto Uribe Gómez

Technological adoption is perhaps one of the best known and most widely researched social phenomena in the academic world, used to study cycles of acceptance and dissemination of new products or services. A well-known example is the Bass diffusion model, which studies the behavior of innovators and imitators. The model was developed in 1969 [1] and remains valid and widely applied to the empirical study of the diffusion of new technologies and innovations in marketing, strategy and technology management, among other fields [2].

The principle governing the Bass model is the existence of a system with two possible states, potential adopters and adopters, where potential adopters move into the group of adopters when they purchase or use an innovative product or service for the first time [3].

The Bass model uses three main parameters: potential market (potential adopters), contact coefficient, and adoption coefficient. This has led to mathematical formulations that come from physics and biology to model the phenomenon as follows [3]:
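The standard form of that equation, with m the potential market, p the adoption (innovation) coefficient, q the contact (imitation) coefficient and A(t) the number of adopters at time t, is usually written as:

\[
\frac{dA(t)}{dt} = \left(p + q\,\frac{A(t)}{m}\right)\bigl(m - A(t)\bigr)
\]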

Analytically, the model offers a predictive picture of the behavior by generating numerical outputs of the phenomenon, and the variables and their relationships can be represented with it, although the solution must be recomputed repeatedly to trace the trajectories of the adoption phenomenon. Equation-based models have therefore found support in computational and simulation models, which represent phenomena not only analytically but also descriptively, especially under two paradigms: system dynamics and agent-based models.

Each simulation paradigm helps in understanding the behavior of social and technological phenomena from one of two perspectives: a strategic one and a tactical one. These methodologies thus help the analyst model such situations according to their research, academic or professional needs.

To model the Bass diffusion phenomenon with system dynamics, causal diagrams are used to represent the multiple relationships between variables, and stock-and-flow diagrams are used to simulate their behavior. The model was built on the web platform https://insightmaker.com/, an open-access platform that allows models to be created quickly under the system dynamics paradigm, as can be seen in Figure 1.
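As a minimal sketch of that stock-and-flow structure outside Insight Maker, the following Python code integrates the Bass equation with a simple Euler step; the parameter values are illustrative assumptions.

```python
# Minimal system-dynamics-style sketch of the Bass model (the article's version was
# built on insightmaker.com). Parameter values are illustrative assumptions.
import matplotlib.pyplot as plt

m, p, q = 10_000, 0.03, 0.38      # potential market, adoption and contact coefficients
adopters = 0.0                     # stock of adopters
dt, t_end = 0.1, 30.0

ts, series = [], []
t = 0.0
while t <= t_end:
    potential = m - adopters                                         # remaining potential adopters
    adoption_rate = p * potential + q * (adopters / m) * potential   # flow into the stock
    adopters += adoption_rate * dt
    ts.append(t)
    series.append(adopters)
    t += dt

plt.plot(ts, series)               # produces the classic S-shaped adoption curve
plt.xlabel("time")
plt.ylabel("adopters")
plt.show()
```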

To represent the adoption model with the agent-based simulation paradigm, unlike system dynamics, several aspects need to be taken into account:
1. Define rules of behavior for the simulated entities in order to obtain macro-level behaviors.
2. Define the interaction environment of the agents.
3. Program the simulation and its states.
In this case, the NETLOGO platform (https://ccl.northwestern.edu/netlogo/), which is free to use and designed for the study of this paradigm, was used to build the proposed model, which can be seen in figure 2.
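A minimal Python counterpart of the three steps above (behavior rules, environment, scheduling) is sketched below; it is not the NETLOGO model from the article, and its parameters are assumptions, but it produces the same kind of discrete adoption dynamics.

```python
# Minimal agent-based sketch of the same diffusion process. Parameters are assumptions.
import random

random.seed(0)
N = 10_000                 # agents, all potential adopters at the start
p, q = 0.03, 0.38          # innovation and imitation probabilities per step
adopted = [False] * N
curve = []

for step in range(30):
    n_adopters = sum(adopted)
    for i in range(N):
        if adopted[i]:
            continue
        # rule: adopt spontaneously (innovation) or by contact with peers (imitation)
        if random.random() < p:
            adopted[i] = True
        elif random.random() < q * n_adopters / N:
            adopted[i] = True
    curve.append(sum(adopted))

print(curve)               # cumulative adopters per step, again an S-shaped curve
```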
Simulating technological adoption with the two models shows similar behavior in their results; the differences are due to the fact that agent-based simulation models are mainly discrete, while system dynamics models are continuous. As can be seen in Figures 3 and 4, both generate the well-known "S" curves of the analytical Bass model, giving the analytical theory a descriptive component.
Both simulation paradigms thus describe the phenomenon; in agent-based modeling, however, the interaction of the agents can be observed during the phenomenon, as shown in Figure 5, and the environment in which the agents are placed can be manipulated to build various adoption scenarios.
In conclusion:
Simulation paradigms can be very useful tools for describing and studying the behavior of phenomena, which are sometimes complex. These paradigms offer several levels of abstraction to represent situations according to the analysis needs: system dynamics works at a macro, strategic, high level of abstraction, while agent-based modeling not only supports a high level of abstraction but also lets the analyst interact directly with the entities, playing with their behavior rules and with multiple scenarios.

Simulación y modelación basada en agentes: Un mundo por explorar


Simulation and agent-based modeling: a world to explore

Julián Alberto Uribe Gómez

When you hear the word "agent", what do you think of? What does it evoke? Perhaps the renowned, four-Oscar-winning film "The Matrix" and its trilogy, famous for its antagonist, Agent Smith. It was there that the concept began to be talked about in a fairly popular way; however, quite apart from that fictional notion, there is an academic concept that researchers and scientists have been exploiting for decades.
It can be said that the concept of "agent" was born in the 1960s, under the influence of the LOGO programming language. This language and its platform had as their main objective education and the teaching of programming (Pea, 2007), apparently in a very didactic way for the time. Back then one did not program "agents" but "turtles": with simple commands or controls, the "turtle" executed a series of orders and movements.
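That flavor of command survives today; Python's built-in turtle module, for instance, gives a feel for what those simple orders looked like (this is only an illustration, not the original LOGO environment):

```python
# A taste of LOGO-style "turtle" commands, using Python's built-in turtle module.
import turtle

t = turtle.Turtle()
for _ in range(4):        # the turtle traces a square by following simple orders
    t.forward(100)        # move 100 units ahead
    t.right(90)           # turn 90 degrees to the right
turtle.done()
```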
This gave rise to much more powerful applications, one of which is the current NETLOGO platform. The platform still implements the original concept of the "turtle"; however, as an academically accepted tool for programming the behavior of individual entities, the "turtle" came to be called an "agent".
NETLOGO, as an educational and procedural platform, was created with the same educational principle as LOGO: children and adults alike can learn it, and no previous programming background is required. The program is open source and is available at https://ccl.northwestern.edu/netlogo/.
Over the years, platforms such as NETLOGO have incorporated various tools that enhance their usefulness in different areas of knowledge. Platforms like this are known as multi-paradigm tools because they allow the exploration of phenomena in biology, physics, chemistry, psychology, networks, computer science, economics and other fields, from educational, procedural and simulation perspectives, entering what is currently known as the study of emergent phenomena or behaviors.
From this perspective, the study, modeling and simulation of complex phenomena have begun to gain great importance; simulation offers an option for improving decision making and reducing response times in situations of conflict and uncertainty (Viveros & Chew, 2013), and it has been shown that traditional analytical mathematics has difficulty establishing relationships and solutions in the short term. Some examples where simulation and "agents" have been used are models of innovation systems, diffusion and adoption of technology, social networks, epidemics and viruses, the behavior of insect colonies and road traffic, among others.
With this in mind, what then is an "agent"? An "agent" is a heterogeneous object or entity, with a set of states or rules, that exhibits pre-programmed behavior to perform specific tasks in a given environment. An agent is also programmed to be autonomous, reliable and able to learn (Foner, n.d.) when interacting with other agents in the system. Examples of agents include ants, people, cars, companies, computers and birds.
A characteristic example of representing agents and studying emergent behavior is the following medical application: consider a tissue that is being attacked by a virus that reproduces very quickly and spreads through the tissue without allowing it to recover. In this case the agents represented in the system are the tissue and the virus population, each with its own behavior rules.
Figure 1 represents the initial setup phase of the tissue-virus simulation; the dots shown are the viral agents sitting on the representation of the tissue.
Starting the simulation involves two important moments: the first is the preparation of the simulation environment; the second is the simulation phase itself, in which the orders given to the tissue and virus agents are executed, resulting in the dynamics represented in figures 2 and 3.
Figure 4 shows the behavior of the system after 500 simulation steps. At the beginning of the graph, the virus and the inner tissue grow at the expense of the outer tissue; this favors the reproduction of the virus while the outer tissue decreases, until the system reaches an equilibrium between the agents.
If this equilibrium system is injected with an enzyme that acts as an antibody, one can study how the system and its behavior change. Another agent, called enzyme, is therefore defined with the rule vaccinate-tissue, giving the result represented in Figure 5.
When the enzyme is injected into the system, the viral agent is attacked, its population decreases, and the affected tissue recovers.
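A very small Python sketch of this kind of tissue-virus system is shown below; it is not the NETLOGO model from the article, and its rules and probabilities are assumptions, but it reproduces the same qualitative story: growth, equilibrium and recovery once the enzyme is introduced.

```python
# Minimal grid sketch of the tissue-virus example. The article's model is written in
# NETLOGO; the rules and probabilities below are illustrative assumptions only.
import random

random.seed(2)
SIZE = 50                                   # 50 x 50 patches of tissue
grid = [["tissue"] * SIZE for _ in range(SIZE)]
for _ in range(20):                         # seed a few viral agents on the tissue
    grid[random.randrange(SIZE)][random.randrange(SIZE)] = "virus"

def count(state):
    return sum(row.count(state) for row in grid)

def step(spread_p=0.30, die_p=0.08, regrow_p=0.05, enzyme_p=0.0):
    for x in range(SIZE):
        for y in range(SIZE):
            cell = grid[x][y]
            if cell == "virus":
                # rule 1: the virus spreads to a random neighbouring patch of healthy tissue
                nx = (x + random.choice((-1, 0, 1))) % SIZE
                ny = (y + random.choice((-1, 0, 1))) % SIZE
                if grid[nx][ny] == "tissue" and random.random() < spread_p:
                    grid[nx][ny] = "virus"
                # rule 2: the virus dies naturally or is cleared by the enzyme (antibody)
                if random.random() < die_p + enzyme_p:
                    grid[x][y] = "damaged"
            elif cell == "damaged" and random.random() < regrow_p:
                # rule 3: damaged patches slowly regrow into healthy tissue
                grid[x][y] = "tissue"

for _ in range(500):                        # virus and tissue drift toward an equilibrium
    step()
print("before enzyme:", count("virus"), "viral patches,", count("tissue"), "healthy patches")

for _ in range(200):                        # inject the enzyme: the tissue recovers
    step(enzyme_p=0.25)
print("after enzyme: ", count("virus"), "viral patches,", count("tissue"), "healthy patches")
```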
Simulation offers benefits in all areas of science, since it allows experiments to be carried out with minimal risk and positive or negative behaviors to be anticipated; combined with the agent-based modeling paradigm, it makes it possible to explore and understand global phenomena that arise from individual behaviors.