
Introduction to Data Compression

  • Electrical & Computer Engineering

Research output : Book/Report › Book

Each edition of Introduction to Data Compression has been widely considered the best introduction and reference text on the art and science of data compression, and the third edition continues in this tradition. Data compression techniques and technology are ever-evolving, with new applications in image, speech, text, audio, and video. The third edition includes all the cutting-edge updates the reader will need during the work day and in class. Khalid Sayood provides an extensive introduction to the theory underlying today's compression techniques, with detailed instruction for their application and several examples to explain the concepts. Encompassing the entire field of data compression, Introduction to Data Compression covers lossless and lossy compression, Huffman coding, arithmetic coding, dictionary techniques, context-based compression, and scalar and vector quantization. Sayood provides a working knowledge of data compression, giving the reader the tools to develop a complete and concise compression package upon completion of the book.

  • New content added on audio compression, including a description of the MP3 algorithm
  • New video coding standard and new facsimile standard explained
  • Established and emerging standards explained in depth, including JPEG 2000, JPEG-LS, MPEG-2, Group 3 and 4 faxes, JBIG2, ADPCM, LPC, CELP, and MELP
  • Source code provided via a companion web site, giving readers the opportunity to build their own algorithms and to choose and implement techniques in their own applications
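The coding techniques listed above can be made concrete with a small sketch. The following is a minimal Huffman code construction in Python, an illustration of the general technique rather than the book's companion code:

```python
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table for the symbols in `data`."""
    freq = Counter(data)
    # Each heap entry: (frequency, tie_breaker, tree); a tree is a symbol or a pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:              # degenerate case: one distinct symbol
        return {heap[0][2]: "0"}
    next_id = len(heap)
    while len(heap) > 1:            # repeatedly merge the two rarest subtrees
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next_id, (t1, t2)))
        next_id += 1
    codes = {}
    def walk(tree, prefix=""):      # read codewords off the tree: 0 = left, 1 = right
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2])
    return codes

table = huffman_codes(b"abracadabra")
# 'a' occurs 5 times, so it gets the shortest codeword.
assert min(table, key=lambda s: len(table[s])) == ord("a")
```

More frequent symbols end up closer to the root of the tree and therefore receive shorter codewords, which is the core idea behind entropy coding.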

ASJC Scopus subject areas

  • General Computer Science

Access to Document

  • 10.1016/B978-0-12-620862-7.X5000-7

Other files and links

  • Link to publication in Scopus
  • Link to the citations in Scopus

Fingerprint

  • Data compression Engineering & Materials Science 100%
  • Facsimile Engineering & Materials Science 41%
  • Vector quantization Engineering & Materials Science 17%
  • Image coding Engineering & Materials Science 15%
  • Glossaries Engineering & Materials Science 14%
  • Websites Engineering & Materials Science 13%

T1 - Introduction to Data Compression

AU - Sayood, Khalid


UR - http://www.scopus.com/inward/record.url?scp=85013706250&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85013706250&partnerID=8YFLogxK

U2 - 10.1016/B978-0-12-620862-7.X5000-7

DO - 10.1016/B978-0-12-620862-7.X5000-7

AN - SCOPUS:85013706250

SN - 9780126208627

BT - Introduction to Data Compression

PB - Elsevier Inc.

Lossless data compression techniques and their performance


Data Compression: Recently Published Documents


Design and development of learning model for compression and processing of deoxyribonucleic acid genome sequence

Owing to the substantial volume of human genome sequence data files (roughly 30-200 GB each), genomic data compression has received considerable traction, and storage costs are one of the major problems faced by genomics laboratories. Modern compression technology reduces not only the storage footprint but also the cost of operating on the data. There have been few attempts to solve this problem independently of both hardware and software. A systematic analysis of associations between genes provides techniques for recognizing operative connections among genes and their respective products, as well as insight into the biological events most important for understanding health and disease phenotypes. This research proposes a reliable and efficient deep learning system that learns embedded projections to combine gene interactions and gene expression, comparing the predictions of deep embeddings against strong baselines. In this paper we perform data processing operations and predict gene function, along with gene ontology reconstruction and gene interaction prediction. The three major steps of genomic data compression are extraction, storage, and retrieval of the data. Hence, we propose a deep learning approach based on computational optimization techniques that is efficient in all three stages.
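The abstract's deep-learning pipeline cannot be reproduced from this summary, but the baseline intuition behind genomic compression can be sketched: a four-letter DNA alphabet needs only 2 bits per base, a 4x saving over one-byte ASCII characters. A minimal illustration (the names and scheme here are illustrative, not the paper's method):

```python
# Baseline illustration: pack ACGT into 2 bits per base.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = {v: k for k, v in CODE.items()}

def pack(seq: str) -> bytes:
    """Pack an ACGT string into 2 bits per base (length stored separately)."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        chunk = seq[i:i + 4]
        byte = 0
        for base in chunk:
            byte = (byte << 2) | CODE[base]
        byte <<= 2 * (4 - len(chunk))   # left-align a final partial byte
        out.append(byte)
    return bytes(out)

def unpack(data: bytes, n: int) -> str:
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):      # read the four 2-bit fields per byte
            bases.append(BASE[(byte >> shift) & 0b11])
    return "".join(bases[:n])

seq = "GATTACA"
packed = pack(seq)
assert unpack(packed, len(seq)) == seq
assert len(packed) == 2   # 7 bases fit in 2 bytes instead of 7
```

Real genome compressors go far beyond this fixed-rate baseline by modeling repeats and context, which is where learned models enter the picture.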

A combination of least significant bit and deflate compression for image steganography

Steganography is a technique related to cryptography in which secret information is hidden in multimedia files such as images and videos. It offers a way of exchanging secret, encrypted information through an atypical mechanism in which only the communicating parties can interpret the secret message. The literature has shown great interest in the least significant bit (LSB) technique, which embeds the secret message bits into the least significant bits of the image pixels. Although LSB has shown stable performance for image steganography, much work remains on the message side. This paper proposes a combination of LSB and the Deflate compression algorithm for image steganography. Deflate combines LZ77 and Huffman coding. After compressing the message text, LSB is applied to embed the text within the cover image. Using benchmark images, the proposed method outperformed the state of the art, demonstrating the efficacy of applying Deflate compression prior to LSB embedding.
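The pipeline described above, Deflate compression followed by LSB embedding, can be sketched as follows. This is a minimal illustration operating on a byte array standing in for raw pixels, not the authors' implementation (a real tool would read an actual image and store the payload length in a header):

```python
import zlib

def embed(cover: bytearray, message: bytes) -> bytearray:
    """Deflate-compress the message, then hide its bits in the pixel LSBs."""
    payload = zlib.compress(message)          # Deflate = LZ77 + Huffman coding
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for payload")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit    # overwrite least significant bit
    return stego

def extract(stego: bytearray, n_payload_bytes: int) -> bytes:
    bits = [stego[i] & 1 for i in range(8 * n_payload_bytes)]
    payload = bytes(
        sum(bits[8 * j + i] << i for i in range(8)) for j in range(n_payload_bytes)
    )
    return zlib.decompress(payload)

cover = bytearray(range(256)) * 20            # stand-in for 5120 grayscale pixels
secret = b"data compression before embedding shrinks the payload" * 3
stego = embed(cover, secret)
assert extract(stego, len(zlib.compress(secret))) == secret
```

Compressing first means fewer bits must be hidden, so fewer pixels are disturbed, which is exactly the benefit the paper measures.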

DDCA-WSN: A Distributed Data Compression and Aggregation Approach for Low Resources Wireless Sensors Networks

Developing an efficient secure query processing algorithm on encrypted databases using data compression

Distributed computing involves storing data with third-party storage and being able to access that information from anywhere at any time. With the advancement of distributed computing and databases, highly critical data are placed in databases. However, when information is stored in outsourced services such as Database as a Service (DaaS), security issues arise on both the server and client sides. Moreover, query processing by many different clients through time-consuming methods in a shared-resource environment can make data processing and retrieval inefficient. Secure and efficient data retrieval can be achieved with an efficient query processing algorithm shared among clients. This paper proposes an Efficient Secure Query Processing Algorithm (ESQPA) that applies data compression before sending encrypted results from the server to clients. Security issues are addressed by encrypting the database at the server side using CryptDB. Encryption techniques have recently been proposed to give clients confidentiality in cloud storage, and this method allows queries to be processed over encrypted data without decryption. To analyze the performance of ESQPA, it is compared with the current query processing algorithm in CryptDB. Results show more efficient use of storage space, with savings of up to 63%.

Telemetry Data Compression Algorithm Using Balanced Recurrent Neural Network and Deep Learning

Telemetry data is large, requiring considerable storage space and transmission time, and storing or sending it is a significant obstacle. Lossless data compression (LDC) algorithms have evolved to process telemetry data effectively and efficiently, with a high compression ratio and a short processing time. Telemetry data can be compressed to limit storage space and communication bandwidth. Although various studies on the compression of telemetry data have been conducted, the nature of the data makes compression extremely difficult. The purpose of this study is to offer a subsampled and balanced recurrent neural lossless data compression (SB-RNLDC) approach for increasing the compression rate while decreasing the compression time. This is accomplished through two models: one for subsampled, averaged telemetry data preprocessing and another for balanced recurrent neural LDC. Subsampling and averaging are conducted at the preprocessing stage using an adjustable sampling factor. A balanced compression interval (BCI) is used to encode the data according to a probability measurement during the LDC stage. This work also compares differential compression techniques directly. The results demonstrate that balancing-based LDC can reduce compression time and improve reliability, and that the proposed model enhances data compression performance compared to existing methodologies.
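The preprocessing stage described above, subsampling and averaging with an adjustable sampling factor, can be sketched as follows (a minimal stand-in; the function name and data are illustrative, not from the paper):

```python
def subsample_average(samples, factor):
    """Average each window of `factor` samples into one value.
    Illustrative preprocessing only; the paper's full SB-RNLDC
    pipeline feeds such averaged data into a recurrent model."""
    return [
        sum(samples[i:i + factor]) / len(samples[i:i + factor])
        for i in range(0, len(samples), factor)
    ]

telemetry = [10.0, 12.0, 11.0, 13.0, 50.0, 52.0]
assert subsample_average(telemetry, 2) == [11.0, 12.0, 51.0]
```

Averaging windows of samples smooths sensor noise and shrinks the sequence by the sampling factor before entropy coding, at the cost of temporal resolution.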

Lossless Genome Data Compression Using V-Gram

A Deep Learning Scheme for Efficient Multimedia IoT Data Compression

Data Compression Algorithms for Sensor Networks with Periodic Transmission Schemes

The operating state of the switch cabinet is significant for the reliability of the whole power system, and collecting and monitoring its data through a wireless sensor network is an effective way to avoid accidents. This paper proposes a data compression method based on a periodic transmission model under the constraints of limited energy consumption and memory space in the complex environment of switch cabinet sensor networks. The proposed method is presented rigorously and intuitively through theoretical derivation and an algorithm flow chart. Finally, numerical simulations are carried out and compared with the original data. Comparisons of compression ratio and error indicate that the improved algorithm performs better on periodic sensing data with interference and preserves the data's trend over time.

A New Transparent Cloud-Based Model for Sharing Medical Images with Data Compression and Proactive Resource Elasticity

Damage detection and localization under variable environmental conditions using compressed and reconstructed Bayesian virtual sensor data

Structural health monitoring (SHM) with a dense sensor network and repeated vibration measurements produces lots of data that have to be stored. If the sensor network is redundant, data compression is possible by storing the signals of selected Bayesian virtual sensors only, from which the omitted signals can be reconstructed with higher accuracy than the actual measurement. The selection of the virtual sensors for storage is done individually for each measurement based on the reconstruction accuracy. Data compression and reconstruction for SHM is the main novelty of this paper. The stored and reconstructed signals are used for damage detection and localization in the time domain using spatial or spatiotemporal correlation. Whitening transformation is applied to the training data to take the environmental or operational influences into account. The first principal component of the residuals is used to localize damage and also to design the extreme value statistics control chart for damage detection. The proposed method was studied with a numerical model of a frame structure with a dense accelerometer or strain sensor network. Only five acceleration or three strain signals out of the total 59 signals were stored. The stored and reconstructed data outperformed the raw measurement data in damage detection and localization.

EE 274: Data Compression, Theory and Applications / Fall 2022-23

  • Fall 23 Course Website is up! Please visit https://stanforddatacompressionclass.github.io/Fall23 for more information.

Course Description

Welcome to EE 274, a class on data compression at Stanford! For the latest iteration of this class, visit the Fall 23 edition of the course.

The amount of data being generated, stored and communicated by humanity is growing at unprecedented rates, currently in the dozens of zettabytes (1 zettabyte = 1 trillion gigabytes) per year by the most conservative of estimates. Data compression, the field dedicated to representing information succinctly, is playing an increasingly critical role in enabling this growth. Progress in storage and communication technologies has led to enhanced capabilities, with a perpetual cat and mouse chase between growing the ability to handle more data and the amounts of it required by new technologies. We are all painfully aware of this conundrum as we run out of space on our phones due to the selfies, boomerang videos and documents we collect.

The goal of this course is to provide an understanding of how data compression enables representing all of this information in a succinct manner. Both theoretical and practical aspects of compression will be covered. A major component of the course is learning through doing - the students will work on a pedagogical data compression library and implement specific compression techniques.

The course structure is as follows:

  • Part I: Lossless compression fundamentals: The first part of the course introduces fundamental techniques for entropy coding and lossless compression, and the intuition behind why these techniques work. We will also discuss how commonly used everyday tools such as GZIP and BZIP2 work.
  • Part II: Lossy compression: The second part covers fundamental techniques from the area of lossy compression. Special focus will be on understanding current image and video coding techniques such as JPEG, BPG, H.264, and H.265. We will also discuss recent advances in using machine learning for image/video compression.
  • Part III: Special topics: The third part of the course gives students exposure to advanced theoretical topics and recent research advances in the field of compression. The topics will be decided based on student interest. Some topics of interest are: distributed compression, succinct data structures, computation & random access on compressed data, and image/video compression for perceptual quality.
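As a small taste of Part I, the property the entropy-coding tools above exploit, redundancy, is easy to observe with Python's built-in gzip module (this snippet is illustrative and not part of the course materials):

```python
import gzip
import os

# Redundant data compresses far better than incompressible data.
redundant = b"abab" * 10_000            # 40 KB of repeating structure
random_ish = os.urandom(40_000)         # 40 KB of random bytes

small = gzip.compress(redundant)
big = gzip.compress(random_ish)
assert len(small) < 500                 # huge win on structured input
assert len(big) > 39_000                # essentially no win on random input
```

No lossless compressor can shrink truly random data; everything GZIP saves on the first input comes from its repeating structure.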

The course is suitable for both undergraduate and graduate students with basic probability and programming background. Please contact us if you are not sure if the course is for you!


Instructors

Kedar Tatwawadi

Shubham Chandak

Tsachy Weissman

Teaching Assistants

Pulkit Tandon

A Research Paper on Lossless Data Compression Techniques

IJIRST - International Journal for Innovative Research in Science and Technology

This research paper surveys lossless data compression techniques and compares them. Data compression is a process that reduces the size of data by removing excessive information. It reduces redundancy in data representation, decreasing the storage required for the data and also reducing communication cost by using the available bandwidth effectively. Data compression is an important application in the areas of file storage and distributed systems. Different data compression techniques exist for different data formats, such as text, audio, video, and image files. There are two main forms of data compression: lossy and lossless. In lossless data compression, the integrity of the data is preserved.
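One of the simplest lossless techniques of the kind surveyed here is run-length encoding, which makes the "redundancy removal" idea concrete. A minimal sketch (illustrative, not taken from the paper):

```python
def rle_encode(data: str) -> list:
    """Run-length encoding: replace each run of equal symbols with [symbol, count]."""
    runs = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1][1] += 1        # extend the current run
        else:
            runs.append([ch, 1])    # start a new run
    return runs

def rle_decode(runs: list) -> str:
    return "".join(ch * n for ch, n in runs)

msg = "AAAABBBCCD"
encoded = rle_encode(msg)
assert encoded == [["A", 4], ["B", 3], ["C", 2], ["D", 1]]
assert rle_decode(encoded) == msg     # lossless: the original is fully recovered
```

The round-trip assertion is exactly what "integrity of data is preserved" means for a lossless scheme.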

Related Papers

International Journal of Scientific Research in Computer Science, Engineering and Information Technology (IJSRCSEIT)

This paper surveys different techniques for lossless data compression and compares them. By eliminating redundant bits, data compression decreases file size. It reduces the redundant bits in the data representation, lowering the capacity needed for the data and using bandwidth effectively to reduce communication cost. Compressing data saves file volume and network bandwidth and speeds up transfers as well. Lossless and lossy are the two broad approaches to data compression; lossless compression preserves the data exactly.

Rohit Bathla

Compression reduces the number of bits required to represent data. It is useful because it reduces the consumption of expensive resources, such as disk space and transmission bandwidth, and it is built into a broad range of technologies, including storage systems, databases, operating systems, and software applications. Hence the data compression algorithm should be chosen appropriately. This paper presents different data compression methodologies. There are two main forms of data compression: lossless and lossy. In this paper, we discuss some of the lossless and lossy data compression methods.

jitendra joshi

This research paper surveys lossless data compression methodologies and compares their performance; Huffman and arithmetic coding are compared in particular. Data compression is a process that reduces data size by removing excessive information. Smaller data is desirable because it simply reduces cost. The aim of data compression is to reduce redundancy in stored or communicated data, thus increasing effective data density. Data compression is an important application in the areas of file storage and distributed systems, because in a distributed system data must be sent between all systems, so compression is used for speed and efficiency. A number of data compression methodologies exist to compress different data formats, such as text, video, audio, and image files. There are two forms of data compression, lossy and lossless; in lossless data compression, the integrity of the data is preserved.

International Journal of Engineering Research and

Dr. Kruti Dangarwala

Jeegar Trivedi

Compression is useful because it helps us reduce resource usage, such as data storage space or transmission capacity. Data compression is the technique of representing information in a compact form. The aim of data compression is to reduce redundancy in stored or communicated data while effectively increasing data density. Data compression is an important tool in the areas of file storage and distributed systems. Storage space on disks is expensive, so a file that occupies less disk space is cheaper than an uncompressed file. The main purpose of data compression is asymptotically optimal data storage for all resources. The field of data compression algorithms can be divided in different ways: lossless data compression and optimal lossy data compression, as well as by storage area. There are many compression methods available. In this paper, we review different basic lossless and lossy compression ...

International Journal of Computer Applications

Dr. Pooja Raundale

International Journal of Engineering Research and Technology (IJERT)

IJERT Journal

https://www.ijert.org/survey-of-lossless-data-compression-algorithms https://www.ijert.org/research/survey-of-lossless-data-compression-algorithms-IJERTV4IS040926.pdf The main goal of data compression is to decrease redundancy in stored or communicated data, thereby increasing effective data density. It is a common requirement for most applications. Data compression is particularly relevant in the areas of file storage and distributed systems, because in a distributed system data must be sent between all systems. There are two configurations of data compression, "lossy" and "lossless", but this paper focuses only on lossless data compression techniques, in which the wholeness of the data is preserved. Data compression is a technique that decreases data size by removing excessive information, and it includes many techniques that decrease redundancy. The methods covered are Run-Length Encoding, Shannon-Fano, Huffman, arithmetic coding, adaptive Huffman, LZ77, LZ78, and LZW, with their performance.
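Of the methods listed, LZW is a compact one to sketch. The following minimal encoder (illustrative, not the paper's code) shows how repeated phrases collapse to single dictionary indices:

```python
def lzw_encode(text: str) -> list:
    """LZW: grow a dictionary of phrases, emitting the index of the longest match."""
    table = {chr(i): i for i in range(256)}   # start with all single bytes
    current, out = "", []
    for ch in text:
        if current + ch in table:
            current += ch                     # keep extending the match
        else:
            out.append(table[current])        # emit code for the longest match
            table[current + ch] = len(table)  # learn the new phrase
            current = ch
    if current:
        out.append(table[current])
    return out

codes = lzw_encode("TOBEORNOTTOBEORTOBEORNOT")
# Repeated phrases collapse to single dictionary indices, so the code
# stream is shorter than the 24 input symbols.
assert len(codes) < 24
assert codes[:4] == [ord("T"), ord("O"), ord("B"), ord("E")]
```

Because the decoder can rebuild the same dictionary from the code stream, no dictionary needs to be transmitted, which is what makes LZW practical.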

Nigerian Journal of Technological Development

Data compression is the process of reducing the size of a file to effectively reduce storage space and communication cost. Developments in technology and the digital age have led to unparalleled usage of digital files in the current decade. This usage has increased the amount of data transmitted via various channels of communication, prompting the need to examine current lossless data compression algorithms for their effectiveness, so as to maximally reduce bandwidth requirements in communication and data transfer. Four lossless data compression algorithms were selected for implementation: the Lempel-Ziv-Welch algorithm, the Shannon-Fano algorithm, the adaptive Huffman algorithm, and run-length encoding. The choice of these algorithms was based on their similarities, particularly in application areas. Their efficiency and effectiveness were evaluated using a set of predefined performance metrics, namely compression ratio, compression factor, compression time, saving percentage, entropy, and code efficiency. The algorithms were implemented in the NetBeans Integrated Development Environment using Java as the programming language. Through statistical analysis performed using boxplots and ANOVA, and a comparison of the four algorithms, the Lempel-Ziv-Welch algorithm was found to be the most efficient and effective based on the evaluation metrics.
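The evaluation metrics named above have standard textbook definitions; a small sketch under those common conventions follows (the paper's exact formulas are not given in this abstract, and this is written in Python rather than the paper's Java):

```python
def compression_metrics(original_size: int, compressed_size: int) -> dict:
    """Common textbook definitions of compression evaluation metrics.
    Note: some authors define "compression ratio" as original/compressed;
    here it is compressed/original, with the inverse called the factor."""
    return {
        "compression_ratio": compressed_size / original_size,
        "compression_factor": original_size / compressed_size,
        "saving_percentage": 100 * (original_size - compressed_size) / original_size,
    }

m = compression_metrics(original_size=1000, compressed_size=250)
assert m["compression_ratio"] == 0.25
assert m["compression_factor"] == 4.0
assert m["saving_percentage"] == 75.0
```

Stating which convention is in use matters when comparing results across papers, since "ratio" alone is ambiguous.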

Alan Janson , Vinayak Bhogan

With the increasing need to store data in less memory, several lossless compression techniques have been developed. This paper provides a performance analysis of lossless compression techniques over various parameters, such as compression ratio, processing delay, and image size. It provides relevant data about the variations among them and describes possible causes. It briefly describes the basic lossless techniques, Huffman encoding, run-length encoding, arithmetic encoding, and Lempel-Ziv-Welch encoding, along with their effectiveness under varying parameters. Considering simulation results for grayscale image compression obtained in MATLAB, it also proposes possible reasons for the differences observed in the comparison.

Journal of Computer Science IJCSIS

This paper presents a survey of several lossless data compression techniques and their corresponding algorithms. A set of selected algorithms is studied and examined. The paper concludes by stating which algorithm performs well for text data. https://sites.google.com/site/ijcsis/



Data compression techniques for stock market prediction

  • Computer Science

Research output : Chapter in Book/Report/Conference proceeding › Conference contribution

This paper presents advanced data compression techniques for predicting stock market behavior under widely accepted market models in finance. Our techniques are applicable to technical analysis, portfolio theory, and nonlinear market models. We find that lossy and lossless compression techniques are well suited for predicting stock prices as well as market modes such as strong trends and major adjustments. We also present novel applications of multispectral compression techniques to portfolio theory, correlation of similar stocks, and the effects of interest rates, transaction costs, and taxes.
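The paper's specific techniques are not detailed in this abstract, but the general idea behind compression-based analysis, that sequences which compress well together are similar, can be illustrated with the normalized compression distance (a standard construction, not necessarily the authors' method):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a generic compression-based
    similarity measure (illustrative, not the paper's technique)."""
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy "price histories": two similar series and one dissimilar series.
a = b"up down up up down " * 50
b = b"up down up up down " * 49 + b"up up down up down "
c = b"flat flat spike crash rally " * 40

assert ncd(a, b) < ncd(a, c)   # similar series compress well together
```

A lower NCD means the compressor found shared structure between the two sequences, the same kind of regularity a compression-based predictor would exploit.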

Publication series

ASJC Scopus subject areas

  • Computer Networks and Communications

Other files and links

  • Link to publication in Scopus
  • Link to the citations in Scopus

Fingerprint

  • data INIS 100%
  • prediction INIS 100%
  • market INIS 100%
  • compression INIS 100%
  • stocks INIS 100%
  • Prediction Market Economics, Econometrics and Finance 100%
  • Market Economics, Econometrics and Finance 100%
  • Stock Economics, Econometrics and Finance 100%

T1 - Data compression techniques for stock market prediction

AU - Azhar, Salman

AU - Badros, Greg J.

AU - Glodjo, Arman

AU - Kao, Ming Yang

AU - Reif, John H.


UR - http://www.scopus.com/inward/record.url?scp=0028135782&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0028135782&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0028135782

SN - 0818656379

T3 - Proceedings of the Data Compression Conference

BT - Proceedings of the Data Compression Conference

A2 - Storer, James A.

A2 - Cohn, Martin

PB - Publ by IEEE

T2 - Proceedings of the Data Compression Conference

Y2 - 29 March 1994 through 31 March 1994

Stanford University


Image credit: Claire Scully

New advances in technology are upending education, from the recent debut of new artificial intelligence (AI) chatbots like ChatGPT to the growing accessibility of virtual-reality tools that expand the boundaries of the classroom. For educators, at the heart of it all is the hope that every learner gets an equal chance to develop the skills they need to succeed. But that promise is not without its pitfalls.

“Technology is a game-changer for education – it offers the prospect of universal access to high-quality learning experiences, and it creates fundamentally new ways of teaching,” said Dan Schwartz, dean of Stanford Graduate School of Education (GSE), who is also a professor of educational technology at the GSE and faculty director of the Stanford Accelerator for Learning. “But there are a lot of ways we teach that aren’t great, and a big fear with AI in particular is that we just get more efficient at teaching badly. This is a moment to pay attention, to do things differently.”

For K-12 schools, this year also marks the end of the Elementary and Secondary School Emergency Relief (ESSER) funding program, which has provided pandemic recovery funds that many districts used to invest in educational software and systems. With these funds running out in September 2024, schools are trying to determine their best use of technology as they face the prospect of diminishing resources.

Here, Schwartz and other Stanford education scholars weigh in on some of the technology trends taking center stage in the classroom this year.

AI in the classroom

In 2023, the big story in technology and education was generative AI, following the introduction of ChatGPT and other chatbots that produce text seemingly written by a human in response to a question or prompt. Educators immediately worried that students would use the chatbot to cheat by trying to pass its writing off as their own. As schools move to adopt policies around students’ use of the tool, many are also beginning to explore potential opportunities – for example, to generate reading assignments or coach students during the writing process.

AI can also help automate tasks like grading and lesson planning, freeing teachers to do the human work that drew them into the profession in the first place, said Victor Lee, an associate professor at the GSE and faculty lead for the AI + Education initiative at the Stanford Accelerator for Learning. “I’m heartened to see some movement toward creating AI tools that make teachers’ lives better – not to replace them, but to give them the time to do the work that only teachers are able to do,” he said. “I hope to see more on that front.”

He also emphasized the need to teach students now to begin questioning and critiquing the development and use of AI. “AI is not going away,” said Lee, who is also director of CRAFT (Classroom-Ready Resources about AI for Teaching), which provides free resources to help teach AI literacy to high school students across subject areas. “We need to teach students how to understand and think critically about this technology.”

Immersive environments

The use of immersive technologies like augmented reality, virtual reality, and mixed reality is also expected to surge in the classroom, especially as new high-profile devices integrating these realities hit the marketplace in 2024.

The educational possibilities now go beyond putting on a headset and experiencing life in a distant location. With new technologies, students can create their own local interactive 360-degree scenarios, using just a cell phone or inexpensive camera and simple online tools.

“This is an area that’s really going to explode over the next couple of years,” said Kristen Pilner Blair, director of research for the Digital Learning initiative at the Stanford Accelerator for Learning, which runs a program exploring the use of virtual field trips to promote learning. “Students can learn about the effects of climate change, say, by virtually experiencing the impact on a particular environment. But they can also become creators, documenting and sharing immersive media that shows the effects where they live.”

Integrating AI into virtual simulations could also soon take the experience to another level, Schwartz said. “If your VR experience brings me to a redwood tree, you could have a window pop up that allows me to ask questions about the tree, and AI can deliver the answers.”

Gamification

Another trend expected to intensify this year is the gamification of learning activities, often featuring dynamic videos with interactive elements to engage and hold students’ attention.

“Gamification is a good motivator, because one key aspect is reward, which is very powerful,” said Schwartz. The downside? Rewards are specific to the activity at hand, which may not extend to learning more generally. “If I get rewarded for doing math in a space-age video game, it doesn’t mean I’m going to be motivated to do math anywhere else.”

Gamification sometimes tries to make “chocolate-covered broccoli,” Schwartz said, by adding art and rewards to make speeded response tasks involving single-answer, factual questions more fun. He hopes to see more creative play patterns that give students points for rethinking an approach or adapting their strategy, rather than only rewarding them for quickly producing a correct response.

Data-gathering and analysis

The growing use of technology in schools is producing massive amounts of data on students’ activities in the classroom and online. “We’re now able to capture moment-to-moment data, every keystroke a kid makes,” said Schwartz – data that can reveal areas of struggle and different learning opportunities, from solving a math problem to approaching a writing assignment.

But outside of research settings, he said, that type of granular data – now owned by tech companies – is more likely to be used to refine the design of the software than to provide teachers with actionable information.

The promise of personalized learning is being able to generate content aligned with students’ interests and skill levels, and making lessons more accessible for multilingual learners and students with disabilities. Realizing that promise requires that educators can make sense of the data that’s being collected, said Schwartz – and while advances in AI are making it easier to identify patterns and findings, the data also needs to be in a system and form educators can access and analyze for decision-making. Developing a usable infrastructure for that data, Schwartz said, is an important next step.

With the accumulation of student data comes privacy concerns: How is the data being collected? Are there regulations or guidelines around its use in decision-making? What steps are being taken to prevent unauthorized access? In 2023 K-12 schools experienced a rise in cyberattacks, underscoring the need to implement strong systems to safeguard student data.

Technology is “requiring people to check their assumptions about education,” said Schwartz, noting that AI in particular is very efficient at replicating biases and automating the way things have been done in the past, including poor models of instruction. “But it’s also opening up new possibilities for students producing material, and for being able to identify children who are not average so we can customize toward them. It’s an opportunity to think of entirely new ways of teaching – this is the path I hope to see.”


An Improved Data Compression Framework for Wireless Sensor Networks Using Stacked Convolutional Autoencoder (S-CAE)

  • Original Research
  • Published: 24 May 2023
  • Volume 4, article number 419 (2023)


  • Lithin Kumble
  • Kiran Kumari Patil


Data compression is crucial in wireless sensor networks (WSNs) because the energy available to sensor nodes is limited, and reducing data reception and transmission greatly extends node lifetime. We introduce a stacked convolutional RBM autoencoder (stacked CAE) model for compressing sensor data. The model consists of two layers: an encode layer, which compresses the sensor data, and a decode layer, which reconstructs it. Both layers are built from four standard restricted Boltzmann machines. To reduce energy consumption further, this work prunes the model's parameters, which lowers the energy spent on computation and storage. The model's effectiveness is evaluated on the Intel Lab data: the average temperature reconstruction error is 0.312 °C, the average percentage RMS difference is 9.84%, and the compression ratio is 10. These figures imply that the energy consumed by node communication in WSNs can be reduced by 92%. Compared with the previous method, the new model attains higher compression efficiency and reconstruction precision at the same compression ratio.
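The compress-then-reconstruct pipeline and the two figures of merit quoted in the abstract (compression ratio and percentage RMS difference, PRD) can be illustrated with a minimal stand-in. The sketch below uses a linear autoencoder (PCA via SVD) on synthetic temperature windows; it is not the paper's RBM-based S-CAE, and the data, window length, and code length are all made up for illustration:

```python
import numpy as np

# Stand-in for the paper's S-CAE: a linear autoencoder built from PCA.
# Everything below (data, window length 40, code length 4) is synthetic.
rng = np.random.default_rng(0)

# 200 windows of 40 "temperature" samples each: a daily cycle plus noise.
t = np.linspace(0, 2 * np.pi, 40)
X = 20 + 5 * np.sin(t) + 0.3 * rng.standard_normal((200, 40))

# Encode: project each window onto k principal components.
# k = 4 gives compression ratio 40 / 4 = 10, the ratio quoted in the abstract.
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 4
codes = (X - mu) @ Vt[:k].T      # what a sensor node would transmit

# Decode (at the sink): reconstruct each window from its k coefficients.
X_hat = codes @ Vt[:k] + mu

# Percentage RMS difference (PRD), the reconstruction metric in the abstract.
prd = 100 * np.linalg.norm(X - X_hat) / np.linalg.norm(X)
ratio = X.shape[1] / k
print(f"compression ratio = {ratio:.0f}, PRD = {prd:.2f}%")
```

Transmitting 4 coefficients instead of 40 samples is where the communication-energy savings come from; the paper's exact 92% figure depends on its radio and model details, which this sketch does not reproduce.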




No funding was received for this research.

Author information

Authors and Affiliations

School of C&IT, REVA University, Bengaluru, Karnataka, 560064, India

Lithin Kumble

School of CSE, REVA University, Bengaluru, Karnataka, 560064, India

Kiran Kumari Patil


Corresponding author

Correspondence to Lithin Kumble.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the topical collection “Advances in Computational Approaches for Image Processing, Wireless Networks, Cloud Applications and Network Security” guest edited by P. Raviraj, Maode Ma and Roopashree H R.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Kumble, L., Patil, K.K. An Improved Data Compression Framework for Wireless Sensor Networks Using Stacked Convolutional Autoencoder (S-CAE). SN COMPUT. SCI. 4 , 419 (2023). https://doi.org/10.1007/s42979-023-01845-7


Received : 22 March 2023

Accepted : 12 April 2023

Published : 24 May 2023

DOI : https://doi.org/10.1007/s42979-023-01845-7


  • Data compression
  • Stacked-CAE
  • Transfer learning
  • Energy consumption optimization


Lyme Disease Surveillance and Data

Over 63,000 cases of Lyme disease were reported to CDC by state health departments and the District of Columbia in 2022. This number reflects cases reported through routine national surveillance, which is only one way public health officials track diseases. Recent estimates using other methods suggest that approximately 476,000 people may be diagnosed and treated for Lyme disease each year in the United States. This number likely includes patients who are treated based on clinical suspicion but do not actually have Lyme disease.
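As a quick sanity check on the two figures above, routine surveillance and the alternative estimates differ by roughly a factor of 7.6 (and since the 476,000 figure is an estimate of diagnoses rather than confirmed cases, this is an upper bound on under-reporting):

```python
# Back-of-envelope comparison of the two figures quoted above.
reported_2022 = 63_000       # cases reported to CDC via routine surveillance
estimated_annual = 476_000   # estimated annual diagnoses (other methods)

multiplier = estimated_annual / reported_2022
print(f"estimated diagnoses per reported case: {multiplier:.1f}")  # ~7.6
```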


Surveillance data explained

Lyme disease has been a nationally notifiable condition in the United States since 1991. Reports of Lyme disease are routinely collected and verified by state and local health departments in accordance with their legal mandate and surveillance practices. After removal of personal identifiers, selected information on cases is shared with CDC through the National Notifiable Diseases Surveillance System (NNDSS). Policies regarding case definitions, reporting, confidentiality, and data release are determined by states and territories under the auspices of the Council of State and Territorial Epidemiologists (CSTE). Surveillance data have a number of limitations that need to be considered in the analysis, interpretation, and reporting of results.

Limitations of surveillance data

  • Under-reporting and misclassification are features common to all surveillance systems. Not every case of Lyme disease is reported to CDC, and some cases that are reported may be due to another cause.
  • Surveillance data are captured by county of residence, not county of exposure.
  • States may close their annual surveillance dataset at a different time than CDC. Thus, the final case counts published by CDC may not exactly match numbers published by each state agency for a given year.
  • Following its implementation in 1991, the national surveillance case definition for Lyme disease was modified in 1996, 2008, 2011, 2017, and again in 2022. Some of these changes impacted surveillance data and must be considered when attempting to interpret trends. Case definitions for each period are available.

Available data

  • Interactive graphs and tables of reported Lyme disease data
  • Interactive map of reported Lyme disease data
  • CDC WONDER tables

Alternative data sources

CDC is currently working to establish enhanced surveillance and research platforms for Lyme disease using electronic health records (EHRs) from large healthcare systems in areas of the U.S. with a high incidence of Lyme disease. The Surveillance Based Lyme Disease Network comprises partners in healthcare systems from Maine, Massachusetts, Pennsylvania, and Wisconsin. CDC will use these EHR data in concert with traditional public health surveillance data to better understand how Lyme disease affects the American public.

Lyme Disease

Lyme disease is caused by Borrelia bacteria spread to people by the bite of an infected blacklegged tick.



Half of Latinas Say Hispanic Women’s Situation Has Improved in the Past Decade and Expect More Gains

Government data shows gains in education, employment and earnings for Hispanic women, but gaps with other groups remain

Table of Contents

  • Assessing the progress of Hispanic women in the last 10 years
  • Views of Hispanic women’s situation in the next 10 years
  • Views on the gender pay gap
  • Latinas’ educational attainment
  • Latinas’ labor force participation
  • Latinas’ earnings
  • Latinas as breadwinners in their relationships
  • Bachelor’s degrees among Latinas
  • Labor force participation rates among Latinas
  • Occupations among working Latinas
  • Earnings among Latinas
  • Latinas as breadwinners in 2022
  • Appendix: Supplemental charts and tables
  • Acknowledgments
  • The American Trends Panel survey methodology
  • Methodology for the analysis of the Current Population Survey

This report explores Latinas’ economic and demographic progress in the last two decades – and their perceptions of that progress – using several data sources.

The first is a Pew Research Center survey of 5,078 Hispanic adults, including 2,600 Hispanic women. Respondents were asked whether U.S. Latinas saw progress in their situation in the last decade, whether they expected progress in the next decade, and how big a problem the U.S. gender pay gap is. The survey was conducted from Nov. 6 to 19, 2023, and includes 1,524 respondents from the American Trends Panel (ATP) and an additional 3,554 from Ipsos’ KnowledgePanel.

Respondents on both panels are recruited through national, random sampling of residential addresses. Recruiting panelists by mail ensures that nearly all U.S. adults have a chance of selection. This gives us confidence that any sample can represent the whole population, or in this case the whole U.S. Hispanic population. (For more information, watch our Methods 101 explainer on random sampling.) For more information on this survey, refer to the American Trends Panel survey methodology and the topline questionnaire .

The second data source is the U.S. Census Bureau’s and Bureau of Labor Statistics’ 2003, 2008, 2013, 2018 and 2023 Current Population Survey (CPS) Monthly and Annual Social and Economic Supplement (ASEC) data series, provided through the Integrated Public Use Microdata Series (IPUMS) from the University of Minnesota.

The CPS Monthly microdata series was used only to calculate median hourly earnings for those ages 25 to 64 years old and who were not self-employed. Medians were calculated for the whole year by considering all wages reported in that year, regardless of month. Median wages were then adjusted to June 2023 dollars using the Chained Consumer Price Index for All Urban Consumers for June of each year. For more information on the demographic analysis, refer to the methodology for the analysis of the Current Population Survey .
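Mechanically, that adjustment is a rescaling by the ratio of index levels. A minimal sketch of the calculation; the index values here are hypothetical placeholders, not the actual Chained CPI (C-CPI-U) series the report used:

```python
# Hypothetical index levels for June of each year (NOT real C-CPI-U values).
CCPIU_JUNE = {2013: 132.0, 2023: 168.0}

def to_june_2023_dollars(nominal_wage: float, year: int) -> float:
    """Inflation-adjust a wage from `year` to June 2023 dollars by
    scaling with the ratio of June index levels, as the report describes."""
    return nominal_wage * CCPIU_JUNE[2023] / CCPIU_JUNE[year]

# With these placeholder indexes, a $13.00 wage in 2013 becomes ~$16.55.
print(round(to_june_2023_dollars(13.00, 2013), 2))
```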

The terms Hispanic and Latino are used interchangeably in this report.

The terms Latinas and Hispanic women are used interchangeably throughout this report to refer to U.S. adult women who self-identify as Hispanic or Latino, regardless of their racial identity.

Foreign born refers to persons born outside of the 50 U.S. states or the District of Columbia. For the purposes of this report, foreign born also refers to those born in Puerto Rico. Although individuals born in Puerto Rico are U.S. citizens by birth, they are grouped with the foreign born because they are born into a Spanish-dominant culture and because on many points their attitudes, views and beliefs are much closer to those of Hispanics born outside the U.S. than to Hispanics born in the 50 U.S. states or D.C., even those who identify themselves as being of Puerto Rican origin.

The terms foreign born and immigrant are used interchangeably in this report. Immigrants are also considered first-generation Americans.

U.S. born refers to persons born in the 50 U.S. states or D.C.

Second generation refers to people born in the 50 U.S. states or D.C. with at least one immigrant parent.

Third or higher generation refers to people born in the 50 U.S. states or D.C., with both parents born in the 50 U.S. states or D.C.

Throughout this report, Democrats are respondents who identify politically with the Democratic Party or those who are independent or identify with some other party but lean toward the Democratic Party. Similarly, Republicans are those who identify politically with the Republican Party and those who are independent or identify with some other party but lean toward the Republican Party.

White, Black and Asian each include those who report being only one race and are not Hispanic.

Civilians are those who were not in the armed forces at the time of completing the Current Population Survey.

Those participating in the labor force either were at work; held a job but were temporarily absent from work due to factors like vacation or illness; were seeking work; or were temporarily laid off from a job in the week before taking the Current Population Survey. In this report, the labor force participation rate is shown only for civilians ages 25 to 64.

The phrases living with children or living with their own child describe individuals living with at least one of their own stepchildren, adopted children or biological children, regardless of the children’s ages. The phrases not living with children or not living with their own child describe individuals who have no children or whose children do not live with them.

Occupation and occupational groups describe the occupational category of someone’s current job, or – if unemployed – most recent job. In this report we measure occupation among civilians participating in the labor force. Occupational groups are adapted from the U.S. Census Bureau’s occupation classification list from 2018 onward.

Hourly earnings, hourly wages and hourly pay all refer to the amount an employee reported making per hour at the time of taking the Current Population Survey where they were employed by someone else. Median hourly wages were calculated only for those ages 25 to 64 who were not self-employed. Calculated median hourly wages shared in this report are adjusted for inflation to 2023. (A median means that half of a given population – for example, Hispanic women – earned more than the stated wage, and half earned less.)

Breadwinners refer to those living with a spouse or partner, both ages 25 to 64, who make over 60% of their and their partner’s combined, positive income from all sources. Those in egalitarian relationships make 40% to 60% of the combined income. For those who make less than 40% of the combined income, their spouse or partner is the breadwinner. This analysis was conducted among both opposite-sex and same-sex couples.
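The breadwinner definition above reduces to two thresholds on one partner's share of the couple's combined income. A minimal sketch (the function name and return labels are ours, not Pew's):

```python
def earner_role(own_income: float, partner_income: float) -> str:
    """Classify a partnered adult by income share, per the definition above:
    over 60% of combined positive income -> breadwinner; 40% to 60% ->
    egalitarian; under 40% -> the partner is the breadwinner."""
    share = own_income / (own_income + partner_income)
    if share > 0.60:
        return "breadwinner"
    if share >= 0.40:
        return "egalitarian"
    return "partner is breadwinner"

print(earner_role(70_000, 30_000))  # breadwinner
print(earner_role(50_000, 50_000))  # egalitarian
print(earner_role(30_000, 70_000))  # partner is breadwinner
```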

Half of Latinas say the situation of Hispanic women in the United States is better now than it was 10 years ago, and a similar share say the situation will improve in the next 10 years.

Bar charts showing that half of Latinas say the situation of U.S. Hispanic women has improved, yet two-thirds say the gender pay gap is a big problem for Hispanic women today. Half of Latinas also say they expect the situation of Hispanic women in the country to improve in the next ten years.

Still, 39% of Latinas say that the situation has stayed the same, and 34% say it will not change in the next 10 years. Two-thirds (66%) say the gender pay gap – the fact that women earn less money, on average, than men – is a big problem for Hispanic women today, according to new analysis of Pew Research Center’s National Survey of Latinos.

At 22.2 million, Latinas account for 17% of all adult women in the U.S. today. Their population grew by 5.6 million from 2010 to 2022, the largest numeric increase of any major female racial or ethnic group. 1

Latinas’ mixed assessments reflect their group’s gains in education and at work over the last two decades, but also stalled progress in closing wage gaps with other groups.

  • Hispanic women are more likely to have a bachelor’s degree today (23% in 2023) than they were in 2013 (16%). More Hispanic women than ever are also completing graduate degrees.
  • Hispanic women have increased their labor force participation rate by 4 percentage points, from 65% in 2013 to 69% in 2023.
  • The median hourly wage of Hispanic women has increased by 17% in the last decade. In 2023, their median hourly wage was $19.23, up from $16.47 in 2013 (in 2023 dollars).

Despite this progress, Hispanic women’s pay gaps with their peers haven’t significantly improved in recent years:

  • The gender pay gap among Hispanics persists with no significant change. In 2023, Hispanic women earned 85 cents (at the median) for every dollar earned by Hispanic men, compared with 89 cents per dollar in 2013 (and 87 cents per dollar in 2003).
  • Hispanic women continue to lag non-Hispanic women in earnings, with no significant improvement in the past decade. In 2023, the median Hispanic woman made 77 cents for each dollar earned by the median non-Hispanic woman, compared with 75 cents per dollar in 2013.
  • The pay gap between Hispanic women and White men has changed only slightly. In 2023, Hispanic women earned 62 cents of every dollar earned by non-Hispanic White men, up from 59 cents per dollar in 2013.
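The cents-per-dollar figures above are simply ratios of median wages. A small sketch, using the report's $19.23 median for Hispanic women and comparison medians back-derived from the stated ratios (so the comparison values are illustrative, not published figures):

```python
def cents_per_dollar(group_median: float, reference_median: float) -> int:
    """Pay gap expressed as whole cents earned per reference-group dollar."""
    return round(100 * group_median / reference_median)

HISPANIC_WOMEN_2023 = 19.23   # median hourly wage, from the report
HISPANIC_MEN_2023 = 22.62     # back-derived from the 85-cent ratio, illustrative
WHITE_MEN_2023 = 31.02        # back-derived from the 62-cent ratio, illustrative

print(cents_per_dollar(HISPANIC_WOMEN_2023, HISPANIC_MEN_2023))  # -> 85
print(cents_per_dollar(HISPANIC_WOMEN_2023, WHITE_MEN_2023))     # -> 62
```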

In addition, Hispanic women lag Hispanic men and non-Hispanic women in labor force participation, and they lag non-Hispanic women in educational attainment. Read more in Chapter 2 .

Among Latinas who are employed, about half (49%) say their current job is best described as “just a job to get them by.” Fewer see their job as a career (30%) or a steppingstone to a career (14%).

Pew Research Center’s bilingual 2023 National Survey of Latinos – conducted Nov. 6-19, 2023, among 5,078 Hispanic adults, including 2,600 Hispanic women – explores what it’s like to be a Latina in the U.S. today. This report uses findings from our 2023 survey as well as demographic and economic data from the Current Population Survey.

The following chapters take a closer look at:

  • How Latinas view the progress and future situation of Hispanic women in the U.S.
  • What government data tells us about Latinas’ progress in the labor market, earnings and educational attainment
  • How Latinas’ educational and economic outcomes vary

For additional survey findings on what it means to be a Latina in the U.S. today and the daily pressures Latinas face, read our report “A Majority of Latinas Feel Pressure To Support Their Families or To Succeed at Work.”

  • Latinas’ population size and growth rate from 2010 to 2022 were calculated using the 2010 and 2022 American Community Surveys, accessed through IPUMS. The rest of the demographic analysis in this post uses data from the Current Population Survey. ↩


Copyright 2024 Pew Research Center
