Results for 'Data compression'

981 found
  1. Intuition, intelligence, data compression. Jens Kipper - 2019 - Synthese 198 (Suppl 27):6469-6489.
    The main goal of my paper is to argue that data compression is a necessary condition for intelligence. One key motivation for this proposal stems from a paradox about intuition and intelligence. For the purposes of this paper, it will be useful to consider playing board games—such as chess and Go—as a paradigm of problem solving and cognition, and computer programs as a model of human cognition. I first describe the basic components of computer programs that play board (...)
  2. Efficient data compression in perception and perceptual memory. Christopher J. Bates & Robert A. Jacobs - 2020 - Psychological Review 127 (5):891-917.
  3. Data compression using an intelligent generator: The storage of chess games as an example. Ingo Althöfer - 1991 - Artificial Intelligence 52 (1):109-113.
  4. Chunking and data compression in verbal short-term memory. Dennis Norris & Kristjan Kalm - 2021 - Cognition 208 (C):104534.
  5. Vehicle Text Data Compression and Transmission Method Based on Maximum Entropy Neural Network and Optimized Huffman Encoding Algorithms. Jingfeng Yang, Zhenkun Zhang, Nanfeng Zhang, Ming Li, Yanwei Zheng, Li Wang, Yong Li, Ji Yang, Yifei Xiang & Yu Zhang - 2019 - Complexity 2019:1-9.
  6. What’s magic about magic numbers? Chunking and data compression in short-term memory. Fabien Mathy & Jacob Feldman - 2012 - Cognition 122 (3):346-362.
  7. Chunking as a rational strategy for lossy data compression in visual working memory. Matthew R. Nassar, Julie C. Helmers & Michael J. Frank - 2018 - Psychological Review 125 (4):486-511.
  8. Chunk formation in immediate memory and how it relates to data compression. Mustapha Chekaf, Nelson Cowan & Fabien Mathy - 2016 - Cognition 155 (C):96-107.
  9. Are candle stick bars a good tool for data compression in natural science? Alfred Hübler - 2011 - Complexity 17 (1):5-8.
  10. Is symbolic dynamics the most efficient data compression tool for chaotic time series? Alfred Hubler - 2012 - Complexity 17 (3):5-7.
  11. Analysis and modification of graphic data compression algorithms. Bouza M. K. - 2020 - Artificial Intelligence Scientific Journal 25 (4):32-40.
    The article examines the algorithms for JPEG and JPEG-2000 compression of various graphic images. The main steps of the operation of both algorithms are given, and their advantages and disadvantages are noted. The main differences between JPEG and JPEG-2000 are analyzed. It is noted that the JPEG-2000 algorithm allows removing visually unpleasant effects. This makes it possible to highlight important areas of the image and improve the quality of their compression. The features of each step of the algorithms are (...)
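The JPEG steps summarized in this abstract center on a block transform followed by quantization. As an illustrative sketch only (not the specific algorithms the article analyzes), the 8×8 DCT-and-quantize core can be written with NumPy; the uniform step size `q=16` is an assumption standing in for JPEG's quantization tables:

```python
import numpy as np

def dct2_matrix(n=8):
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    j = np.arange(n)[:, None]          # frequency index (rows)
    k = np.arange(n)[None, :]          # sample index (columns)
    m = np.cos(np.pi * (2 * k + 1) * j / (2 * n)) * np.sqrt(2 / n)
    m[0, :] /= np.sqrt(2)              # DC row normalization
    return m

def jpeg_like_block(block, q=16):
    """2-D DCT of an 8x8 block, uniform quantization, reconstruction."""
    d = dct2_matrix(8)
    coeffs = d @ block @ d.T           # forward 2-D DCT-II
    quant = np.round(coeffs / q)       # the lossy step: most entries become 0
    return d.T @ (quant * q) @ d       # dequantize and invert

block = np.tile(np.linspace(0, 255, 8), (8, 1))  # smooth gradient block
rec = jpeg_like_block(block)
print(np.max(np.abs(block - rec)))  # stays well below the step size q
```

For smooth blocks, almost all quantized coefficients are zero, which is what a subsequent entropy coder exploits; the visually unpleasant artifacts the abstract mentions appear when this step discards coefficients that mattered.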
  12. Quick Compression and Transmission of Meteorological Big Data in Complicated Visualization Systems. He-Ping Yang, Ying-Rui Sun, Nan Chen, Xiao-Wei Jiang, Jing-Hua Chen, Ming Yang, Qi Wang, Zi-Mo Huo & Ming-Nong Feng - 2022 - Complexity 2022:1-9.
    The sizes of individual data files have steadily increased along with rising demand for customized services, leading to issues such as low efficiency of web-based geographical information system (WebGIS)-based data compression, transmission, and rendering for rich Internet applications in complicated visualization systems. In this article, a WebGIS-based technical solution for the efficient transmission and visualization of meteorological big data is proposed. Based on open-source technology such as HTML5 and Mapbox GL, the proposed scheme considers distributed (...) compression and transmission on the server side as well as distributed requests and page rendering on the browser side. A high-low 8-bit compression method is developed for compressing a 100 megabyte file into a megabyte-scale file, with a compression ratio of approximately 90%, and the recovered data are accurate to two decimal places. Another part of the scheme combines pyramid tile cutting, concurrent domain name request processing, and texture rendering. Experimental results indicate that with this scheme, grid files of up to 100 MB can be transferred and displayed in milliseconds, and multiterminal service applications can be supported by building a grid data visualization mode for big data and technology centers, which may serve as a reference for other industries.
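The article's high-low 8-bit method is not spelled out in the excerpt above, but the idea the name suggests, scaling a value to a 16-bit integer and splitting it into high and low bytes while preserving two decimal places, can be sketched as follows; the function names and the scale factor of 100 are assumptions, not the paper's specification:

```python
import numpy as np

def pack_high_low(values, scale=100):
    """Encode floats as 16-bit ints split into high/low bytes.

    Keeps two decimal places, assuming 0 <= value * scale < 65536.
    """
    ints = np.round(np.asarray(values) * scale).astype(np.uint16)
    high = (ints >> 8).astype(np.uint8)   # high 8 bits
    low = (ints & 0xFF).astype(np.uint8)  # low 8 bits
    return high, low

def unpack_high_low(high, low, scale=100):
    ints = (high.astype(np.uint16) << 8) | low
    return ints / scale

grid = np.array([12.34, 0.07, 255.99, 301.5])  # e.g. gridded values
high, low = pack_high_low(grid)
restored = unpack_high_low(high, low)
print(restored)  # matches the input to two decimal places
```

By itself this only halves the size of 32-bit floats; the roughly 90% ratio reported presumably also depends on the further compression and tiling steps the abstract describes.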
  13. Algorithmic compression of empirical data: reply to Twardy, Gardner, and Dowe. James McAllister - 2005 - Studies in History and Philosophy of Science Part A 36 (2):403-410.
    This discussion note responds to objections by Twardy, Gardner, and Dowe to my earlier claim that empirical data sets are algorithmically incompressible. Twardy, Gardner, and Dowe hold that many empirical data sets are compressible by the Minimum Message Length technique and offer this as evidence that these data sets are algorithmically compressible. I reply that the compression achieved by the Minimum Message Length technique is different from algorithmic compression. I conclude that Twardy, Gardner, and Dowe fail to (...)
  14. Empirical data sets are algorithmically compressible: Reply to McAllister. Charles Twardy, Steve Gardner & David L. Dowe - 2005 - Studies in History and Philosophy of Science Part A 36 (2):391-402.
    James McAllister’s 2003 article, “Algorithmic randomness in empirical data” claims that empirical data sets are algorithmically random, and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and discuss the matter in the framework of Minimum Message Length (MML) inference, which shows that the theory which best compresses the data is the one with highest posterior probability, and the best explanation of the data.
  16. Compression mode: the edge of sensibility. Stephen Kennedy - 2025 - New York: Bloomsbury Academic.
    This book examines how compression can be understood not only as a digital process enacted through computing, but as an economic and political phenomenon that impacts the ecology of waste, diversity and social inclusivity. Beginning with a linguistic underpinning of visual space, the book examines the development of the MP3 algorithm and the 'waste' it creates, challenging the wisdom that human reason and language are uniquely capable of bringing order to chaos. Returning to the idea of a sonic economy, (...)
     
  17. Non-linear data stream compression: foundations and theoretical results. Alfredo Cuzzocrea & Hendrik Decker - 2012 - In Emilio Corchado, Vaclav Snasel, Ajith Abraham, Michał Woźniak, Manuel Grana & Sung-Bae Cho (eds.), Hybrid Artificial Intelligent Systems. Springer. pp. 622-634.
  18. Image Compression Based on Block SVD Power Method. Khalid El Asnaoui - 2019 - Journal of Intelligent Systems 29 (1):1345-1359.
    In recent years, the important and fast growth in the development and demand of multimedia products is contributing to an insufficiency in the bandwidth of devices and network storage memory. Consequently, the theory of data compression becomes more significant for reducing data redundancy in order to allow more transfer and storage of data. In this context, this paper addresses the problem of lossy image compression. Indeed, this new proposed method is based on the block singular (...)
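The block-SVD approach named in this entry rests on low-rank approximation: keeping only the k largest singular triplets of an image block. A minimal NumPy sketch, using the standard SVD rather than the paper's power-method variant (the test matrix below is an illustrative stand-in for an image block):

```python
import numpy as np

def svd_compress(img, k):
    """Rank-k approximation: keep only the k largest singular triplets."""
    u, s, vt = np.linalg.svd(img, full_matrices=False)
    return (u[:, :k] * s[:k]) @ vt[:k]   # u_k @ diag(s_k) @ vt_k

rng = np.random.default_rng(0)
# A nearly low-rank "image": rank-4 structure plus mild noise.
img = rng.random((64, 4)) @ rng.random((4, 64)) + 0.01 * rng.random((64, 64))
err_rank4 = np.linalg.norm(img - svd_compress(img, 4))
err_rank1 = np.linalg.norm(img - svd_compress(img, 1))
print(err_rank4, err_rank1)  # rank 4 already captures almost everything
```

Storing a rank-k approximation of an m×n block takes k(m + n + 1) numbers instead of mn, which is where the compression comes from.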
  19. Compressibility and the Reality of Patterns. Tyler Millhouse - 2021 - Philosophy of Science 88 (1):22-43.
    Daniel Dennett distinguishes real patterns from bogus patterns by appeal to compressibility. As information theorists have shown, data are compressible if and only if those data exhibit a pattern....
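The compressibility criterion in this entry, that data are compressible if and only if they exhibit a pattern, can be illustrated with an off-the-shelf compressor standing in for the information-theoretic ideal (zlib is only a rough, computable proxy for Kolmogorov complexity):

```python
import os
import zlib

patterned = b"ab" * 5000      # 10,000 bytes of pure repetition
random_ = os.urandom(10_000)  # 10,000 bytes with (almost surely) no pattern

len_patterned = len(zlib.compress(patterned, 9))
len_random = len(zlib.compress(random_, 9))
print(len_patterned, len_random)  # the patterned data shrink drastically
```

The patterned string compresses to a few dozen bytes while the random one barely shrinks at all, which is the operational sense in which "real pattern" and "compressible" line up.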
  20. Compressed Sensing - A New Mode of Measurement. Thomas Vogt - 2016 - In Nicola Mößner & Alfred Nordmann (eds.), Reasoning in Measurement. New York: Routledge.
    After introducing the concept of compressed sensing as a complementary measurement mode to the classical Shannon-Nyquist approach, I discuss some of the drivers, potential challenges and obstacles to its implementation. I end with a speculative attempt to embed compressed sensing as an enabling methodology within the emergence of data-driven discovery. As a consequence I predict the growth of non-nomological sciences where heuristic correlations will find applications but often bypass conventional pure basic and use-inspired basic research stages due to the (...)
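Compressed sensing, as introduced in this chapter, recovers a sparse signal from far fewer measurements than the Shannon-Nyquist count. A minimal sketch using orthogonal matching pursuit, one standard recovery algorithm; the matrix sizes, sparsity level, and seed here are illustrative assumptions, not anything from the chapter:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from y = Ax."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the column most correlated with what is still unexplained
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(60, 100)) / np.sqrt(60)  # 60 measurements, 100 unknowns
x_true = np.zeros(100)
x_true[[5, 30, 50]] = [2.0, -2.0, 2.0]        # 3-sparse signal
x_hat = omp(A, A @ x_true, k=3)
print(np.max(np.abs(x_hat - x_true)))  # typically near zero: support recovered
```

The system is underdetermined (60 equations, 100 unknowns), yet sparsity plus a random measurement matrix makes exact recovery the typical outcome, which is the core departure from classical sampling.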
  21. Understanding as compression. Daniel A. Wilkenfeld - 2019 - Philosophical Studies 176 (10):2807-2831.
    What is understanding? My goal in this paper is to lay out a new approach to this question and clarify how that approach deals with certain issues. The claim is that understanding is a matter of compressing information about the understood so that it can be mentally useful. On this account, understanding amounts to having a representational kernel and the ability to use it to generate the information one needs regarding the target phenomenon. I argue that this ambitious new account (...)
  22. Parallel Fractal Compression Method for Big Video Data. Shuai Liu, Weiling Bai, Gaocheng Liu, Wenhui Li & Hari M. Srivastava - 2018 - Complexity 2018:1-16.
  23. Compressibility, Laws of Nature, Initial Conditions and Complexity. Sergio Chibbaro & Angelo Vulpiani - 2017 - Foundations of Physics 47 (10):1368-1386.
    We critically analyse the point of view for which laws of nature are just a means to compress data. Discussing some basic notions of dynamical systems and information theory, we show that the idea that the analysis of large amounts of data by means of an algorithm of compression is equivalent to the knowledge one can have from scientific laws, is rather naive. In particular we discuss the subtle conceptual topic of the initial conditions of phenomena which (...)
  24. Compressive Strength Prediction Using Coupled Deep Learning Model with Extreme Gradient Boosting Algorithm: Environmentally Friendly Concrete Incorporating Recycled Aggregate. Mayadah W. Falah, Sadaam Hadee Hussein, Mohammed Ayad Saad, Zainab Hasan Ali, Tan Huy Tran, Rania M. Ghoniem & Ahmed A. Ewees - 2022 - Complexity 2022:1-22.
    The application of recycled aggregate as a sustainable material in construction projects is considered a promising approach to decrease the carbon footprint of concrete structures. Prediction of compressive strength of environmentally friendly concrete containing recycled aggregate is important for understanding sustainable structures’ concrete behaviour. In this research, the capability of the deep learning neural network approach is examined on the simulation of CS of EF concrete. The developed approach is compared to the well-known artificial intelligence approaches named multivariate adaptive regression (...)
  25. Compressibility and the Algorithmic Theory of Laws. Billy Wheeler - 2019 - Principia: An International Journal of Epistemology 23 (3):461-485.
    The algorithmic theory of laws claims that the laws of nature are the algorithms in the best possible compression of all empirical data. This position assumes that the universe is compressible and that data received from observing it is easily reproducible using a simple set of rules. However, there are three sources of evidence that suggest that the universe as a whole is incompressible. The first comes from the practice of science. The other two come from the (...)
  26. Biometric recognition system performance measures for lossy compression on EEG signals. Binh Nguyen, Wanli Ma & Dat Tran - forthcoming - Logic Journal of the IGPL.
    Electroencephalogram (EEG) plays an essential role in analysing and recognizing brain-related diseases. EEG has been increasingly used as a new type of biometrics in person identification and verification systems. These EEG-based systems are important components in applications for both police and civilian work, and both areas process a huge amount of EEG data. Storing and transmitting these huge amounts of data are significant challenges for data compression techniques. Lossy compression is used for EEG data as (...)
  27. Compressed Sensing for THz FMCW Radar 3D Imaging. Shanshan Gu, Guangrong Xi, Lingyu Ge, Zhong Yang, Yizhi Wang, Weina Chen & Zhenzhong Yu - 2021 - Complexity 2021:1-10.
    A terahertz frequency-modulated continuous wave imaging radar system is developed for high-resolution 3D imaging recently. Aiming at the problems of long data acquisition periods and large sample sizes for the developed imaging system, an algorithm based on compressed sensing is proposed for THz FMCW radar 3D imaging in this paper. Firstly, the FMCW radar signal model is built, and the conventional range migration algorithm is introduced for THz FMCW radar imaging. Then, compressed sensing is extended for THz FMCW radar (...)
  28. On the application of compression-based metrics to identifying anomalous behaviour in web traffic. Gonzalo de la Torre-Abaitua, Luis F. Lago-Fernández & David Arroyo - 2020 - Logic Journal of the IGPL 28 (4):546-557.
    In cybersecurity, there is a call for adaptive, accurate and efficient procedures for identifying performance shortcomings and security breaches. The increasing complexity of both Internet services and traffic determines a scenario that in many cases impedes the proper deployment of intrusion detection and prevention systems. Although it is a common practice to monitor network and applications activity, there is not a general methodology to codify and interpret the recorded events. Moreover, this lack of methodology somehow erodes the possibility of diagnosing (...)
  29. Soft computing based compressive sensing techniques in signal processing: A comprehensive review. Sanjay Jain & Ishani Mishra - 2020 - Journal of Intelligent Systems 30 (1):312-326.
    In this modern world, a massive amount of data is processed and broadcasted daily. This includes the use of high energy, massive use of memory space, and increased power use. In a few applications, for example, image processing, signal processing, and possession of data signals, etc., the signals included can be viewed as light in a few spaces. The compressive sensing theory could be an appropriate contender to manage these limitations. “Compressive Sensing theory” preserves extremely helpful while signals (...)
  30. Application of Normalized Compression Distance and Lempel-Ziv Jaccard Distance in Micro-electrode Signal Stream Classification for the Surgical Treatment of Parkinson’s Disease. Kamil Ząbkiewicz - 2018 - Studies in Logic, Grammar and Rhetoric 56 (1):45-57.
    Parkinson’s Disease can be treated with the use of microelectrode recording and stimulation. This paper presents a data stream classifier that analyses raw data from micro-electrodes and decides whether the measurements were taken from the subthalamic nucleus (STN) or not. The novelty of the proposed approach is based on the fact that distances based on raw data are used. Two distances are investigated in this paper, i.e. Normalized Compression Distance (NCD) and Lempel-Ziv Jaccard Distance (LZJD). No (...)
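The Normalized Compression Distance used in this paper has a standard definition: NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)) for a real compressor C. A sketch with zlib standing in for C (the paper's actual compressor choice and signal data are not given in the excerpt, so the byte strings below are illustrative):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    with zlib's compressed length standing in for the compressor C."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"0101" * 600           # a regular signal
b2 = b"0101" * 500 + b"11"  # near-duplicate of a
c = bytes(range(256)) * 9   # structurally unrelated stream
print(ncd(a, b2), ncd(a, c))  # the similar pair scores much lower
```

Because the measure works directly on raw byte streams, no hand-crafted features are needed, which is exactly the property the paper exploits for raw micro-electrode recordings.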
  31. HWCD: A hybrid approach for image compression using wavelet, encryption using confusion, and decryption using diffusion scheme. Alagarswamy Ramaprasath & Heggere Rangaswamaiah Latha - 2023 - Journal of Intelligent Systems 32 (1).
    Image data play an important role in various real-time online and offline applications. The biomedical field has adopted imaging systems to detect, diagnose, and prevent several types of diseases and abnormalities. Biomedical imaging data contain huge amounts of information, which require huge storage space. Moreover, telemedicine and IoT-based remote health monitoring systems, in which data are transmitted from one place to another, are now widely developed. Transmission of this type of huge data consumes more bandwidth. Along with this, (...)
  32. Data Hiding Based on Improved Exploiting Modification Direction Method and Huffman Coding. Tanzila Saba, Mohammed Hazim Alkawaz, Amjad Rehman, Ghazali Sulong & Ali M. Ahmad - 2014 - Journal of Intelligent Systems 23 (4):451-459.
    The rapid growth of covert activities via communication networks has brought about an increasing need for an efficient method of data hiding to protect secret information from malicious attacks. One option is to combine two approaches, namely steganography and compression. However, performance heavily relies on three major factors, namely payload, imperceptibility, and robustness, which are always in trade-off. Thus, this study aims to hide a large amount of secret message inside a grayscale host image without sacrificing (...)
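Huffman coding, one half of the hybrid scheme above, assigns shorter codewords to more frequent symbols so the compressed payload shrinks before it is hidden. A minimal sketch with Python's heapq (illustrative only; not the paper's improved variant):

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a Huffman code: frequent symbols receive shorter codewords."""
    freq = Counter(text)
    # heap entries: (weight, tie_breaker, {symbol: code_so_far})
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

codes = huffman_codes("aaaaaaabbbccd")
print(codes)  # 'a' gets the shortest codeword
```

The resulting code is prefix-free, so the concatenated bitstream decodes unambiguously, which is what makes it usable as a compression stage in a steganographic pipeline.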
  33. Simplicity tracks truth because compression tracks probability. Steve Petersen - manuscript.
    The simplicity of a theory seems closely related to how well the theory summarizes individual data points. Think, for example, of classic curve-fitting. It is easy to get perfect data-fit with a ‘‘theory’’ that simply lists each point of data, but such a theory is maximally unsimple (for the data-fit). The simple theory suggests instead that there is one underlying curve that summarizes this data, and we usually prefer such a theory even at some expense (...)
     
  34. Algorithmic randomness in empirical data. James W. McAllister - 2003 - Studies in History and Philosophy of Science Part A 34 (3):633-646.
    According to a traditional view, scientific laws and theories constitute algorithmic compressions of empirical data sets collected from observations and measurements. This article defends the thesis that, to the contrary, empirical data sets are algorithmically incompressible. The reason is that individual data points are determined partly by perturbations, or causal factors that cannot be reduced to any pattern. If empirical data sets are incompressible, then they exhibit maximal algorithmic complexity, maximal entropy and zero redundancy. They are (...)
  35. Fault Diagnosis for Hydraulic Servo System Using Compressed Random Subspace Based ReliefF. Yu Ding, Fei Wang, Zhen-ya Wang & Wen-jin Zhang - 2018 - Complexity 2018:1-14.
    Playing an important role in electromechanical systems, the hydraulic servo system is crucial to mechanical systems like engineering machinery, metallurgical machinery, ships, and other equipment. Fault diagnosis based on monitoring and sensory signals plays an important role in avoiding catastrophic accidents and enormous economic losses. This study presents a fault diagnosis scheme for the hydraulic servo system using the compressed random subspace based ReliefF (CRSR) method. From the point of view of feature selection, the scheme utilizes the CRSR method to determine the most stable feature (...)
  36. A Phenomenological Study of Ginger Compress Therapy for People with Osteoarthritis. Tessa Therkleson - 2010 - Indo-Pacific Journal of Phenomenology 10 (1):1-10.
    This paper claims rigour and sensitivity for a methodology used to explore multiple sources of data and expose the essential characteristics of a phenomenon in the human sciences. A descriptive phenomenological methodology was applied in a study of the experience of ten people with osteoarthritis receiving ginger compress therapy. The application of the phenomenological attitude, with reduction, bracketing and imaginative variation, allowed multiple sources of data – written, pictorial and oral – to be explicated. The applied methodology used (...)
  37. Now or … later: Perceptual data are not immediately forgotten during language processing. Klinton Bicknell, T. Florian Jaeger & Michael K. Tanenhaus - 2016 - Behavioral and Brain Sciences 39.
    Christiansen & Chater propose that language comprehenders must immediately compress perceptual data by “chunking” them into higher-level categories. Effective language understanding, however, requires maintaining perceptual information long enough to integrate it with downstream cues. Indeed, recent results suggest comprehenders do this. Although cognitive systems are undoubtedly limited, frameworks that do not take into account the tasks that these systems evolved to solve risk missing important insights.
  38. Construction of Social Security Fund Cloud Audit Platform Based on Fuzzy Data Mining Algorithm. Yangting Huai & Qianxiao Zhang - 2021 - Complexity 2021:1-11.
    Guided by the theories of system theory, synergetic theory, and other disciplines and based on fuzzy data mining algorithm, this article constructs a three-tier social security fund cloud audit platform. Firstly, the article systematically expounds the current situation of social security fund and social security fund audit, such as the technical basis of cloud computing and data mining. Combined with the actual work, the necessity and feasibility of building a cloud audit platform for social security funds are analyzed. (...)
  39. Humeanism and Exceptions in the Fundamental Laws of Physics. Billy Wheeler - 2017 - Principia: An International Journal of Epistemology 21 (3):317-337.
    It has been argued that the fundamental laws of physics do not face a ‘problem of provisos’ equivalent to that found in other scientific disciplines (Earman, Roberts and Smith 2002) and there is only the appearance of exceptions to physical laws if they are confused with differential equations of evolution type (Smith 2002). In this paper I argue that even if this is true, fundamental laws in physics still pose a major challenge to standard Humean approaches to lawhood, as they (...)
  40. The origins of larval forms: what the data indicate, and what they don't. Alessandro Minelli - 2010 - Bioessays 32 (1):5-8.
    What is a larva, if it is not what survives of an ancestor's adult, compressed into a transient pre‐reproductive phase, as suggested by Haeckel's largely disreputed model of evolution by recapitulation? A recently published article hypothesizes that larva and adult of holometabolous insects are developmental expressions of two different genomes coexisting in the same animal as a result of an ancient hybridization event between an onychophoran and a primitive insect with eventless post‐embryonic development. More likely, however, larvae originated from late (...)
  41. Introducing a Method for Intervals Correction on Multiple Likert Scales: A Case Study on an Urban Soundscape Data Collection Instrument. Matteo Lionello, Francesco Aletta, Andrew Mitchell & Jian Kang - 2021 - Frontiers in Psychology 11.
    Likert scales are useful for collecting data on attitudes and perceptions from large samples of people. In particular, they have become a well-established tool in soundscape studies for conducting in situ surveys to determine how people experience urban public spaces. However, it is still unclear whether the metrics of the scales are consistently interpreted during a typical assessment task. The current work aims at identifying some general trends in the interpretation of Likert scale metrics and introducing a procedure for (...)
  42. Real patterns and indispensability. Abel Suñé & Manolo Martínez - 2021 - Synthese 198 (5):4315-4330.
    While scientific inquiry crucially relies on the extraction of patterns from data, we still have a far from perfect understanding of the metaphysics of patterns—and, in particular, of what makes a pattern real. In this paper we derive a criterion of real-patternhood from the notion of conditional Kolmogorov complexity. The resulting account belongs to the philosophical tradition, initiated by Dennett (1991), that links real-patternhood to data compressibility, but is simpler and formally more perspicuous than other proposals previously (...)
  43. Dimensional reduction in complex living systems: Where, why, and how. Jean-Pierre Eckmann & Tsvi Tlusty - 2021 - Bioessays 43 (9):2100062.
    The unprecedented prowess of measurement techniques provides a detailed, multi‐scale look into the depths of living systems. Understanding these avalanches of high‐dimensional data—by distilling underlying principles and mechanisms—necessitates dimensional reduction. We propose that living systems achieve exquisite dimensional reduction, originating from their capacity to learn, through evolution and phenotypic plasticity, the relevant aspects of a non‐random, smooth physical reality. We explain how geometric insights by mathematicians allow one to identify these genuine hallmarks of life and distinguish them from universal (...)
  44. Basic concepts in algorithms. Shmuel T. Klein - 2021 - Hoboken: World Scientific.
    This book is the result of several decades of teaching experience in data structures and algorithms. It is self-contained but does assume some prior knowledge of data structures, and a grasp of basic programming and mathematics tools. Basic Concepts in Algorithms focuses on more advanced paradigms and methods combining basic programming constructs as building blocks and their usefulness in the derivation of algorithms. Its coverage includes the algorithms' design process and an analysis of their performance. It is primarily (...)
  45. Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence: Papers From the Ray Solomonoff 85th Memorial Conference, Melbourne, Vic, Australia, November 30 - December 2, 2011. David L. Dowe (ed.) - 2013 - Springer.
    Algorithmic probability and friends: Proceedings of the Ray Solomonoff 85th memorial conference is a collection of original work and surveys. The Solomonoff 85th memorial conference was held at Monash University's Clayton campus in Melbourne, Australia as a tribute to pioneer, Ray Solomonoff, honouring his various pioneering works - most particularly, his revolutionary insight in the early 1960s that the universality of Universal Turing Machines could be used for universal Bayesian prediction and artificial intelligence. This work continues to increasingly influence and (...)
  46. Selected papers on design of algorithms. Donald Ervin Knuth - 2010 - Stanford, Calif.: Center for the Study of Language and Information.
    Donald E. Knuth has been making foundational contributions to the field of computer science for as long as computer science has been a field. His award-winning textbooks are often given credit for shaping the field, and his scientific papers are widely referenced and stand as milestones of development over a wide variety of topics. The present volume, the seventh in a series of his collected papers, is devoted to his work on the design of new algorithms. Nearly thirty of Knuth’s (...)
  47. Ask Not What You Can Do for Yourself: Cartesian Chaos, Neural Dynamics, and Immunological Cognition. [REVIEW] Seán Ó Nualláin - 2010 - Biosemiotics 3 (1):79-92.
    This paper focuses on the disparate phenomena we psychologize as “selfhood”. A central argument is that, far from being a deus ex machina as required in the Cartesian schema, our felt experience of self is above all a consequence of data compression. In coming to this conclusion, it considers in turn the Cartesian epiphany, other traditional and contemporary perspectives, and a half-century’s empirical work in the Freeman lab on neurodynamics. We introduce the concept of consciousness qua process as (...)
  48. Using ontologies and STEP standards for the semantic simplification of CAD models in different engineering domains. Jorge Posada, Carlos Toro, Stefan Wundrak & Andre Stork - 2006 - Applied Ontology 1 (3-4):263-279.
    We present in this article a compression system that uses STEP compliant standards for the compression and design review visualization of large CAD data sets. Our approach is orthogonal to the traditional techniques applied in the field, as we complement previous works by introducing semantic criteria along with algorithms for the categorization, simplification and user-oriented adaptation of the engineering components described by domain-specific standards. As an example, we have implemented two test cases in two specific domains – (...)
  49. Blueprints, Swiss Army knives, and other metaphors. [REVIEW] Timothy Justus - 2004 - Trends in Cognitive Sciences 8:201–203.
    In this book review essay, Justus discusses The Birth of the Mind: How a Tiny Number of Genes Creates the Complexities of Human Thought (2004) by Gary Marcus. The review opens by contrasting the common architectural-blueprint metaphor for the genome with an alternative: the if-then statements of a computer program. The former leads to a seeming “gene shortage” problem while the latter are better suited to representing the cascades of genetic expression that give rise to exponential genotype-phenotype relationships. The essay (...)
  50. MDLChunker: A MDL-Based Cognitive Model of Inductive Learning. Vivien Robinet, Benoît Lemaire & Mirta B. Gordon - 2011 - Cognitive Science 35 (7):1352-1389.
    This paper presents a computational model of the way humans inductively identify and aggregate concepts from the low-level stimuli they are exposed to. Based on the idea that humans tend to select the simplest structures, it implements a dynamic hierarchical chunking mechanism in which the decision whether to create a new chunk is based on an information-theoretic criterion, the Minimum Description Length (MDL) principle. We present theoretical justifications for this approach together with results of an experiment in which participants, exposed (...)
1 — 50 / 981