Deindustrialization refers to the decline of the manufacturing sector, a trend observed since the late 1960s in developed countries. Some developing countries, where industrialization is recent and often incomplete, now face what is known as "premature deindustrialization".
The rise and fall of manufacturing has been far from smooth. The sector has undergone continuous change, driven by technological advancements and evolving consumer preferences. The most recent shift involves outsourcing specific tasks and moving industrial activities, either partially or entirely, to emerging or developing countries, resulting in a global and diverse network of suppliers.
To assess whether these changes have weakened or merely transformed manufacturing, I suggest analyzing input-output data from 76 countries between 1995 and 2020, aiming to clarify the global reality of deindustrialization.
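To illustrate the kind of computation such an analysis rests on, here is a minimal sketch of measuring manufacturing's share of total value added over time. The figures, sector names, and the helper `manufacturing_share` are hypothetical placeholders, not the actual input-output data or method of the study:

```python
# Hypothetical value-added figures by sector (billions, common currency).
# These numbers are illustrative only, not the study's actual data.
value_added = {
    1995: {"manufacturing": 320.0, "services": 610.0, "other": 170.0},
    2020: {"manufacturing": 410.0, "services": 1280.0, "other": 210.0},
}

def manufacturing_share(year: int) -> float:
    """Manufacturing value added as a share of total value added."""
    sectors = value_added[year]
    return sectors["manufacturing"] / sum(sectors.values())

for year in sorted(value_added):
    print(f"{year}: manufacturing share = {manufacturing_share(year):.1%}")
```

Note that in this toy example manufacturing grows in absolute terms yet its share of the economy falls, which is precisely the distinction between a weakened and a merely transformed sector that the abstract raises.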
Digital transformation is reshaping Europe's economic and social fabric. This study investigates the Digital Economy and Society Index (DESI)—which tracks Connectivity, Human Capital, Use of Internet Services, Integration of Digital Technology, and Digital Public Services—to reveal hidden patterns in EU digital progress from 2016 to 2020.
Through an innovative cluster analysis approach, we uncover the dynamic landscape of digital convergence across Member States, shedding light on both advances and the persistent digital divide. The analysis reveals that while all regions register notable digital improvements over the period, the pace and scale of progress remain markedly heterogeneous.
Scandinavian countries emerge as digital front-runners, setting benchmarks for technological leadership, whereas Eastern and Southern European Member States continue to face significant challenges.
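The clustering idea behind such an analysis can be sketched in a few lines. The example below runs a plain k-means on made-up, DESI-style dimension scores for four hypothetical countries (labeled A–D); the scores, the number of clusters, and the country labels are all illustrative assumptions, not the study's data or exact method:

```python
import random

# country -> hypothetical scores on five DESI-style dimensions:
# [connectivity, human capital, internet use, tech integration, public services]
scores = {
    "A": [0.90, 0.85, 0.90, 0.80, 0.90],  # front-runner profile
    "B": [0.85, 0.90, 0.80, 0.85, 0.85],  # front-runner profile
    "C": [0.40, 0.45, 0.50, 0.35, 0.40],  # catching-up profile
    "D": [0.45, 0.40, 0.45, 0.40, 0.35],  # catching-up profile
}

def dist(p, q):
    """Euclidean distance between two score vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign points to nearest center, recompute centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist(p, centers[i]))].append(p)
        centers = [
            [sum(c) / len(g) for c in zip(*g)] if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

names = list(scores)
centers = kmeans([scores[n] for n in names], k=2)
labels = {n: min(range(2), key=lambda i: dist(scores[n], centers[i]))
          for n in names}
print(labels)  # A and B fall into one cluster, C and D into the other
```

With well-separated profiles like these, the two clusters recover the front-runner versus catching-up split regardless of initialization, which is the kind of grouping the abstract's convergence analysis formalizes across the actual Member States.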
Seventy years after the phrase Artificial Intelligence was coined and fifty years after Microsoft was founded, the IT “menu” tells a layered story. In the 1950s hulking mainframes filled whole floors, handling “a comparatively small amount of data” in Fortran or COBOL for accounting departments. Networking in the 1980s and the IBM PC (1981) opened applications for “multi‑faceted customers” such as production plants and inventory managers. Marketing‑science pioneer John Little drove the computer into decision science, while pop culture adopted monikers like “Big Blue,” the mis‑attributed boast “640K ought to be enough for anybody,” and IBM’s terse exhortation “Think.”
Across the facing page, AI blossomed in research. Turing’s Imitation Game (1950) set the benchmark; Christopher Strachey’s 1951 draughts (checkers) program recorded an early game‑playing win; the Perceptron (1957) launched the “Modern Neural Network”; ELIZA (1965) opened Natural Language Processing; and the General Problem Solver (1957) displayed heuristic search. The “fruit of this labor” is clear today as Waymo taxis glide through San Francisco, while since 2022 a wave of Large Language Models—ChatGPT, Gemini, and others—has seeped into phones, factory dashboards, and even fast‑food ordering systems.
Herbert Simon once predicted AI would “take over” within ten years; forty years later that promise finally joins the menu. Data Scientists now steer progress, affirming Peter L. Bernstein’s warning: “The information you have is not the information you want… The information you can obtain costs more than you want to pay.” Yesterday’s flavors—Expert Systems, Knowledge Management, Decision Support, Business Intelligence, Big Data—have blended into AI. Room‑sized cabinets shrank to pockets, wired became wireless, data lives in the Cloud, and LLMs are edging toward replacing Google Search. Since Deep Blue defeated Garry Kasparov in 1997, one question endures: What will be on the IT menu in 2033 or 2053, and which path leads there?
Applying artificial intelligence (AI) in statistical analysis requires a solid foundation in statistical knowledge, tools, and analytical practices within enterprises. This study presents the results of a survey conducted among Croatian enterprises to assess whether they possess the necessary background to adopt AI-supported approaches in their statistical workflows. The survey gathered data on the current use of statistical methods, the types of software applied, the frequency and purposes of statistical analyses, and the statistical education of employees involved in data-related tasks.
Findings indicate that while many enterprises report some level of engagement with statistical methods, this is often limited to fundamental descriptive analysis and operational reporting, most frequently using spreadsheet tools. The use of advanced statistical software and methods is relatively rare. Additionally, formal statistical training among staff appears limited, with few enterprises offering structured education or ongoing capacity development.
These results point to significant gaps in statistical readiness that could hinder the effective implementation of AI in enterprise settings. The analysis highlights sectoral and size-based differences and suggests that improving statistical literacy, tool adoption, and organizational support structures is essential for building the analytical foundations for responsible and efficient AI integration.