Sustainable AI. Tools for Moving Toward Green AI

ISBN: 9781098155476
Pages: 292, Format: ebook
Publication date: 2025-10-08
Bookstore: Helion
Book price: 228,65 zł (previously: 265,87 zł)
You save: 14% (-37,22 zł)
In the era of big data and even bigger machine learning models powering the current generative AI revolution, the environmental footprint of these developments can no longer be ignored. This much-needed guide confronts the challenge head-on, offering a groundbreaking exploration into making deep learning (DL) both efficient and accessible. Author Raghavendra Selvan exposes the high costs—both environmental and economic—of traditional DL methods and presents practical solutions that pave the way for a more sustainable AI.
This essential read is for anyone in the machine learning field, from the academic researcher to the industry practitioner, who wants to make a meaningful impact on both their work and the world. This book enables readers to be agents of change toward a more sustainable and inclusive technological future.
- Learn strategies to significantly reduce the energy consumption, carbon footprint, and hardware demands of DL models
- Examine ways to break down barriers and foster a more inclusive future in AI development
- Explore strategies for cutting costs and minimizing ecological impact
- Learn how to balance performance with efficiency in model development
- Gain proficiency in cutting-edge tools that enhance the sustainability of your AI projects
Table of Contents
- Preface
- Who Should Read This Book?
- What This Book Is and Is Not
- Using This Book
- Conventions Used in This Book
- Using Code Examples
- O'Reilly Online Learning
- How to Contact Us
- Acknowledgments
- 1. Sustainability and Artificial Intelligence
- Scope of Sustainability
- Artificial Intelligence: The New Electricity?
- Sustainability × AI
- AI for Sustainability
- Sustainability of AI
- Energy consumption of AI
- Climate impact of AI
- Sustainability impact of AI
- A Green Path to Sustainable AI
- TL;DR
- 2. Under the Hood of Generative AI
- Representation Learning
- Overview of Representation Spaces
- Learning Representation Spaces
- Learning Representations to GenAI
- Autoencoders
- Unregularized autoencoders
- Regularized autoencoders
- Large Language Models
- Multimodal Generative Models
- Tour of Neural Architectures
- Data Modalities
- Neural Network Zoo
- Multilayer perceptrons (Everything is just a vector)
- Convolutional neural networks (Local patterns repeat everywhere)
- Recurrent neural networks (Yesterday affects today)
- Graph neural networks (Relationships, not grids)
- Transformers (Let every token talk to every other token)
- Tokenization makes transformers versatile
- Formalizing Machine Learning
- Nonlinear Models and Deep Learning
- How to Train Your Model
- Gradient descent
- Stochastic gradient descent
- Building GenAI
- GenAI Ingredients
- Resources and Engineering at Scale
- Additional Resources
- Common Notations
- Datasets
- AerialNIST dataset
- FAIRYTALES dataset
- From ML Basics to Sustainable AI
- 3. Quantifying the Efficiency of Deep Learning
- AI Waste
- Resource Consumption of Deep Learning
- Resource Efficiency and Climate Awareness
- Actual Carbon Footprint of AI
- Resource Efficiency and Sustainable AI
- Quantifying Resource Consumption of AI
- Model Complexity
- Number of parameters
- Multiply-accumulate
- Floating-point operations
- Efficient matrix multiplications
- Computation Time
- Runtime and latency
- GPU hours
- Energy Consumption
- Energy consumption of AI
- Estimating energy consumption
- Carbon Footprint of AI Models
- GHG Emissions and Carbon Footprint
- Relating Carbon Footprint to Energy Consumption
- Estimating the Carbon Footprint of AI Models
- Efficiency Quantified: What Comes Next?
- 4. Data Parsimony
- The Cost of Data
- Carbon Footprint of Data Storage
- Scale of Datasets in AI
- Carbon Footprint of Processing Data
- Dataset Curation
- Active Learning for Dataset Creation
- Learning with Pruned Datasets
- Instance Selection
- Random sampling
- K-means clustering
- K-center selection
- Tokenization and Data Efficiency in Modern AI Models
- Coreset Selection
- Herding
- Importance-based coreset selection
- Learning with Compressed Data
- Data Point Compression
- Random projection
- Principal component analysis
- Autoencoders
- Dataset Condensation
- Dataset condensation with performance matching
- Dataset condensation with distribution matching
- Data and Dataset Compressed: What Comes Next?
- 5. Automating Model Selection
- Motivation
- The Model Selection Hierarchy: MC3-Space
- Model Selection as Optimization
- Search space
- Optimization criteria
- Optimization strategy
- Hyperparameter Optimization
- Grid Search
- Random Search
- Bayesian Optimization
- Neural Architecture Search
- NAS Search Space
- NAS as Optimization
- NAS Using Random Search
- NAS Using Evolutionary Algorithms
- Efficiency and NAS
- NAS tabular benchmarks
- NAS benchmarks for surrogate models
- Model Selection in the Era of Foundational Models
- Mixture of Experts
- Model Selection Automated: What Comes Next?
- 6. Training Efficiency
- Training Costs of AI Models
- Transfer Learning
- Pretrained Models
- Fine-Tuning of Pretrained Models
- In-Context Learning in LLMs
- Training Compressed Neural Networks
- Neural Network Pruning
- Factorized Neural Networks
- Low-Rank Adaptation of Foundational Models
- Quantization
- Low-Precision Training
- Quantizing Optimizer States
- Efficient Training Achieved: What Comes Next?
- 7. Lean Inference
- Lifetime Cost of an AI Model
- Achieving Lean Inference
- Resource-Efficient Architectures
- Resource-aware NAS with Pareto optimization
- Knowledge Distillation
- Pruning of Trained Models
- Post-Training Quantization
- Static quantization
- Dynamic quantization
- Deploying Models
- Cross-Platform Models
- Inference Beyond Python
- AI Model Inference in Low-Level Languages
- Serving Foundational Models in C++
- Inference Is Lean: What Comes Next?
- 8. Hardware Considerations
- Environmental Cost of AI Hardware
- Embodied Emissions
- E-Waste
- Hardware Scaling Laws of AI
- The Alchemy of Creating AI
- Improving the Resource Efficiency of AI Hardware
- Cluster-Level Optimization
- Green scheduling
- Parallelism in AI workloads
- Model sharding
- Accelerator-Level Optimization
- GPU collocation
- Dynamic voltage frequency scaling
- Custom Hardware Optimization
- Hardware-optimized software
- Neural processing units
- Hardware Optimized: What Comes Next?
- 9. A Recipe for Sustainable AI
- Technical Debt of Machine Learning
- Environmental Debt of AI
- Transparency Debt
- Data Debt
- Other Elements of Environmental Debt
- Operationalizing Sustainable AI
- MLOps
- Green MLOps
- Green MLOps in Practice
- Model Cards
- Energy Ratings
- Orchestration Frameworks
- Sustainable AI Operationalized: What Comes Next?
- 10. Toward Sustainable AI
- Rebound Effects and AI
- Efficiency Is Not Enough
- Broader Environmental Effects
- Beyond Efficiency
- Economic Sustainability of AI
- Social Sustainability of AI
- The Way Forward
- Systems Thinking
- Putting Systems Thinking into Practice
- Sustainable AI Systems Assessment framework
- Sustainable AI principles
- Impact of Sustainable AI
- Epilogue
- Index