A research infrastructure for the social sciences and humanities
At Cortext, our goal is to empower researchers in the social sciences and humanities by promoting advanced qualitative-quantitative mixed methods. Our primary focus is on studies about the dynamics of science, technology and innovation, and about the roles of knowledge and expertise in societies.
We understand the move towards digital humanities and computational methods not as addressing a technological gap for the social sciences, but rather as entailing entirely new assemblages between their disciplines and those of modern statistics and computer science. We work to tackle ever more complex research problems and to deal with the profusion of new and diverse sources of information, without losing sight of the situatedness and reflexivity required of studies of human societies.
Cortext is hosted by the LISIS research unit at Gustave Eiffel University, and was launched by French institutes IFRIS and INRAE, receiving their continued support.
Cortext Manager
Cortext Manager is our current main attraction, a publicly available web service providing data analysis methods curated and developed by our team of researchers and engineers.
You upload a textual corpus to analyse its discourse, names, categories, citations, places, dates, etc., with methods for science/controversy/issue mapping, distant reading, document clustering, geo-spatial and network visualizations, and more.
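To give a flavour of what such corpus mapping involves, here is a minimal, hedged sketch (not Cortext Manager's actual pipeline, whose methods are described in the Documentation) of one building block behind term maps: counting which terms co-occur in the same documents of a toy corpus, using only the Python standard library:

```python
from itertools import combinations
from collections import Counter

# Toy corpus: one document per string (a real corpus would be uploaded files).
corpus = [
    "offshore wind energy policy",
    "wind energy technology upscaling",
    "smart city data policy",
]

# Count how often each pair of terms appears in the same document.
cooccurrence = Counter()
for doc in corpus:
    terms = sorted(set(doc.split()))
    for pair in combinations(terms, 2):
        cooccurrence[pair] += 1

# Pairs seen in more than one document form the edges of a simple term network.
edges = [(a, b) for (a, b), n in cooccurrence.items() if n > 1]
print(edges)  # → [('energy', 'wind')]
```

A real workflow adds term extraction, filtering, and layout on top of counts like these; the point here is only that a "map" of a corpus starts from structured co-occurrence data.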
You can jump straight to Cortext Manager and create an account, but we strongly suggest taking a look at the Documentation and Tutorials as you start your journey.
@online{Bento2018,
title = {Legitimation and Guidance in Energy Technology Upscaling – The Case of Floating Offshore Wind},
author = {Nuno Bento and Margarida Fontes},
url = {http://documents.manchester.ac.uk/display.aspx?DocID=37431},
year = {2018},
date = {2018-04-02},
abstract = {This research studies the role of the formation of collective visions and plans in accelerating the upscaling of emerging low-carbon innovations. We analyze the national roadmaps that have been developed for offshore wind energy in deep waters, i.e., more than 50 meters deep, where there is high resource potential but the technology is still immature. The analysis focuses on how actors create legitimacy and guidance to prepare the growth of the system. The results point to different types of guidance depending on the technological and institutional context, particularly a higher external openness with technology maturity and government involvement. A survey of actors’ opinions complements the roadmap analysis, revealing a tendency towards overinflating expectations. In addition, it suggests roadmaps have a positive but limited impact on technology development. Policy implications include recommendations for managing the process of forming visions and legitimacy for new technologies entering upscaling.},
note = {see published article: https://doi.org/10.1016/j.rser.2018.09.035},
keywords = {},
pubstate = {published},
tppubtype = {online}
}
@online{Taylor2018,
title = {Data Justice and Singapore’s Smart Nation},
author = {Linnet Taylor and Shazade Jameson and Josh Bullock and Quynh Tu Hoang and Jeroen de Vos and Maarten van Gestel and Timo Nijssen and Olivia Dziwak and Kristoffer Rekve and Yoren Lausberg and Stefany Winona Santosa and Wen Yang and Giovanni Zenga},
editor = {The Digital Methods Initiative (DMI)},
url = {https://digitalmethods.net/Dmi/SingaporeSmartNation},
year = {2018},
date = {2018-01-25},
urldate = {2018-01-25},
abstract = {We aimed to map the networks and key concepts involved in Singapore’s ‘Smart Nation’ initiative from the perspective of the Singaporean authorities, and to map and analyse the popular response to datafication.
We found that the authorities’ narrative is clear and replicated across multiple online sources. It is authored by a mixture of government and commercial actors and has strong resonance with international discourse on smart cities. It is principally hosted via Facebook and websites belonging to the government and its partners, and there is little engagement (regarding response/re-sharing) visible online from citizens.
We were able to map the official discourse quite quickly, but a widespread/critical counter-narrative was harder to find, draw out and analyse. We found that the visible critical response to the smart nation initiative revolves principally around functionality and efficiency (‘this does not work as promised’) and that there are no clearly visible public threads of discourse around rights or surveillance in relation to data. We found concerns with datafication mainly on local news sites and Reddit.
This analysis has mainly been used to help us to identify gaps and silences on the side of citizens. The social media sources with the highest penetration in Singapore carry the government narrative almost exclusively. Those with lower penetration have some responses from citizens, but in general, the public-facing component of the smart nation initiative is governmental.
Critical voices in relation to Singapore’s datafication are largely unavailable to remotely conducted digital methods. We conclude from our investigation that it is worth using digital methods to analyse the government narrative on datafication, but that researchers hoping to identify the alternative narratives should initially do so through ethnographic fieldwork and through that generate questions that are more amenable to digital methods.},
keywords = {},
pubstate = {published},
tppubtype = {online}
}
@online{Emambakhsh2018,
title = {From Hollywood to Bollywood, the rise of the #metoo movement in the Indian Twitter sphere},
author = {T. Emambakhsh and B. Da Fonseca Andreatta and C. Pan and S. Rico},
editor = {Medialab Science Po},
url = {https://fonio.medialab.sciences-po.fr/thinkdolphin/read/4004953c-4796-4a64-bbf9-962179684086?lang=en},
year = {2018},
date = {2018-01-01},
urldate = {2018-01-01},
keywords = {},
pubstate = {published},
tppubtype = {online}
}
@phdthesis{Vigni2018,
title = {Les systèmes complexes et la digitalisation des sciences. Histoire et sociologie des instituts de la complexité aux États-Unis et en France},
author = {Guido Fabrizio Li Vigni},
url = {http://www.theses.fr/2018PSLEH134},
year = {2018},
date = {2018-11-26},
urldate = {2018-11-26},
abstract = {How should we think about the relationship between contemporary scientific cultures and the growing use of computers in the production of knowledge? This thesis offers an answer to this question through a historical and sociological analysis of a scientific field founded by the Santa Fe Institute (SFI) in the United States in the 1980s: the "complex systems sciences" (CSS). Popularized by mainstream publications, CSS spread during the 1990s and 2000s to Europe and other countries. This work offers a history of the founding of the field, focusing on the SFI and on the French Réseau National des Systèmes Complexes. With a sociological perspective grounded in Science & Technology Studies and in the pragmatist tradition, it then raises questions about the socio-epistemic status of the field, about how proof is established in knowledge based on numerical simulation, and about the epistemic commitments held by complex systems specialists. The empirical material (roughly 200 interviews, several thousand pages of archives and a number of laboratory visits) not only gives us a better understanding of this research field, whose language is widespread today yet little studied by historians and sociologists; it also leads us to question three common opinions in the humanities literature about the digital sciences, namely: 1) that computers produce increasingly interdisciplinary knowledge, 2) that they give rise to new kinds of knowledge requiring an entirely different epistemology, and 3) that they inevitably bring about neoliberal worldviews.
This thesis deconstructs these three forms of technological determinism concerning the effects of computers on scientific practice, showing first that, in the computational sciences, interdisciplinary relations are neither effortless, peaceful, nor conducted on an equal footing; second, that CSS researchers mobilize forms of proof already developed in other disciplines; and finally, that scientists' epistemic commitments can take a form close to the (neo)liberal vision, but also forms that diverge from or oppose it.},
note = {Doctoral thesis supervised by Francis Chateauraynaud, Sociology, Paris Sciences et Lettres (ComUE), 2018}
}
Last week, Ale Abdo and Joenio Costa presented at the first ever OpenAlex User Conference a short talk entitled “Analysing OpenAlex data with Cortext”, highlighting the current and [...]
In May 2024, Ale Abdo was at the University of São Paulo, invited by two departments to talk about different aspects of Cortext. On the 22nd, a workshop organized with professor Gisele Craveiro [...]
Here in the outskirts of Paris, at Champs-sur-Marne, work is ongoing to build the future of Cortext. It will soon be 8 years since the second version of the open-for-all web service, Cortext [...]
Long trends on Twitter: inter-temporal clusters combining hashtags and terms, for all tweets on Scientometrics, Altmetrics, Bibliometrics and Science of Science from Jan. 2017 to Dec. 2021, on a [...]
Cortext Newsfeed
Want to stay up to date with the latest training sessions and developments in our methods and data? We invite you to subscribe to the Cortext Newsfeed, our succinct, researcher-oriented quarterly newsletter.