                   Interoperability: The ability of different systems, software applications, or devices to
                   communicate, work together, and exchange information seamlessly, without compatibility
                   issues, even if they were developed by different organizations or for different purposes.

                   Large language models (LLMs): A type of AI model trained on large amounts of text data and
                   designed to understand and generate human-like text.
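
                   As a minimal, illustrative sketch of this definition: the snippet below generates text with a
                   small, publicly available pretrained language model, assuming the Hugging Face transformers
                   library and its gpt2 checkpoint (chosen purely for illustration; production LLMs are far larger).

                   from transformers import pipeline

                   # Load a small pretrained language model (an illustrative stand-in for a full-scale LLM).
                   generator = pipeline("text-generation", model="gpt2")

                   # The model continues the prompt with human-like text.
                   result = generator("AI can support humanitarian work by", max_new_tokens=25)
                   print(result[0]["generated_text"])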

                   Machine learning (ML): A subset of AI that involves the development of algorithms and statistical
                   models that enable computers to learn from and make predictions or decisions based on data
                   without being explicitly programmed.
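
                   A minimal sketch of this "learning from data" idea, assuming the scikit-learn library is
                   installed: the model below is never given hand-written rules; it infers them from labelled
                   examples and is then scored on data it has not seen.

                   from sklearn.datasets import load_iris
                   from sklearn.model_selection import train_test_split
                   from sklearn.tree import DecisionTreeClassifier

                   # Labelled examples: flower measurements (X) and their species (y).
                   X, y = load_iris(return_X_y=True)
                   X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

                   # The model learns decision rules from the training data, not from explicit programming.
                   model = DecisionTreeClassifier().fit(X_train, y_train)
                   print("accuracy on unseen data:", model.score(X_test, y_test))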

                   Natural language processing (NLP): A branch of AI that focuses on the interaction between
                   computers and human language. It involves enabling machines to understand, interpret, and
                   generate human language in a meaningful way.
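
                   One early step in most NLP systems is turning text into numbers a computer can process. The
                   sketch below does this with a simple bag-of-words representation, assuming scikit-learn is
                   installed; modern systems use far richer representations, but the goal of bridging human
                   language and computation is the same.

                   from sklearn.feature_extraction.text import CountVectorizer

                   sentences = [
                       "AI can help communities respond to crises.",
                       "Communities can help train AI responsibly.",
                   ]

                   # Build a vocabulary and count how often each word appears in each sentence.
                   vectorizer = CountVectorizer()
                   matrix = vectorizer.fit_transform(sentences)

                   print(vectorizer.get_feature_names_out())  # the learned vocabulary
                   print(matrix.toarray())                    # word counts per sentence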

                   Neural network: A computational model inspired by the human brain, consisting of layers of
                   interconnected nodes (neurons) that process input data to produce an output. Neural networks
                   are the foundation of deep learning.
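
                   A minimal sketch of the forward pass this definition describes, using only NumPy: each layer
                   of "neurons" multiplies its input by a weight matrix, adds a bias, and applies a nonlinearity.
                   The weights here are random placeholders; in a real network they are learned from training data.

                   import numpy as np

                   rng = np.random.default_rng(0)

                   def layer(x, weights, bias):
                       # Each neuron combines its inputs and applies a ReLU nonlinearity.
                       return np.maximum(0, x @ weights + bias)

                   x = rng.normal(size=(1, 4))                               # one input example with 4 features
                   hidden = layer(x, rng.normal(size=(4, 8)), np.zeros(8))   # hidden layer of 8 neurons
                   output = hidden @ rng.normal(size=(8, 1))                 # output layer producing a single value
                   print(output)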

                   Symbolic AI: An approach to artificial intelligence that uses symbols or concepts, rather
                   than numerical data, to represent knowledge, and logical rules to manipulate these symbols
                   for reasoning and problem-solving.
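
                   A minimal sketch of symbolic reasoning in plain Python: knowledge is stored as symbols and a
                   hand-written logical rule rather than numerical parameters, and a new fact is derived by
                   applying the rule. The family facts are illustrative placeholders.

                   # Knowledge base of symbolic facts: parent(alice, bob), parent(bob, carol).
                   facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

                   def grandparents(facts):
                       # Rule: parent(X, Y) and parent(Y, Z)  =>  grandparent(X, Z)
                       derived = set()
                       for (_, x, y1) in facts:
                           for (_, y2, z) in facts:
                               if y1 == y2:
                                   derived.add(("grandparent", x, z))
                       return derived

                   print(grandparents(facts))  # {('grandparent', 'alice', 'carol')}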

                   Training data: The dataset used to train an AI or machine learning model. The quality and
                   quantity of training data significantly impact the model's performance.

                   Transfer learning: A machine learning technique where a model trained on one task is reused
                   or adapted for a different but related task. It's particularly useful when there is limited data
                   available for the new task.
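
                   A minimal sketch of transfer learning, assuming PyTorch and torchvision are installed: an
                   image model pretrained on ImageNet is reused, its feature-extraction layers are frozen, and
                   only a new final layer is trained for a related task (an assumed five-class problem here).

                   import torch
                   import torch.nn as nn
                   from torchvision import models

                   # Reuse a network already trained on a large, general dataset (ImageNet).
                   model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

                   # Freeze the pretrained feature-extraction layers.
                   for param in model.parameters():
                       param.requires_grad = False

                   # Replace the final layer so it predicts the new task's five classes;
                   # only this small part is trained on the limited data for the new task.
                   model.fc = nn.Linear(model.fc.in_features, 5)
                   optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)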









































