

30 tech terms you haven't heard about before

Jeremy DSouza
Dec 17


Here’s a list of 30 tech terms you’ve never heard of before (but you’ll wish you had). Ready to dive in?


Technological Singularity

The technological singularity is the theoretical point in the future after which the growth of technology would become uncontrollable and irreversible, causing unpredictable changes to human life and civilizations. The ‘Intelligence Explosion’ is the most popular Singularity theory, according to which an upgradable intelligent agent will eventually get into a "runaway reaction" or a loop of self-improvement cycles, with every iteration being more intelligent than the previous one and evolving at an increasingly rapid pace. This would theoretically cause an ‘explosion’ in intelligence, leading to artificial superintelligence that would go far beyond all human intelligence.

BIRCH

The BIRCH (balanced iterative reducing and clustering using hierarchies) algorithm is used for hierarchical clustering on extremely large datasets. It is an unsupervised data mining algorithm that can even be tweaked to speed up k-means clustering and Gaussian mixture modeling with the expectation-maximization algorithm.

One of the most significant advantages of BIRCH is that it can cluster incoming, multi-dimensional metric data points incrementally and dynamically, producing the best clustering possible for a given set of resources (memory and time constraints).
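
For instance, here’s a minimal sketch of streaming BIRCH clustering, assuming scikit-learn’s Birch implementation (the dataset below is synthetic):

from sklearn.cluster import Birch
from sklearn.datasets import make_blobs

# Synthetic data standing in for a large stream of points.
X, _ = make_blobs(n_samples=10_000, centers=5, random_state=42)

model = Birch(n_clusters=5, threshold=0.5)
for start in range(0, len(X), 1_000):
    # partial_fit matches BIRCH's incremental, single-pass design:
    # points arrive in chunks and update the clustering tree in place.
    model.partial_fit(X[start:start + 1_000])

print(model.predict(X)[:10])  # cluster labels for the first 10 points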

Situation calculus

Situation calculus is a logic formalism for representing and reasoning about dynamical domains. The core idea is that reachable states can be defined by the actions that must be taken to reach them. You can think of situation calculus as a feature-based representation of actions.
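
As a toy illustration (my own minimal encoding, not a full logic engine), you can treat a situation as the history of actions performed since the initial situation, and evaluate a fluent by replaying that history:

S0 = ()  # the initial situation: no actions performed yet

def do(action, situation):
    """Return the successor situation reached by performing `action`."""
    return situation + (action,)

def door_open(situation):
    """A successor-state-style fluent: the door is open iff the most
    recent of "open"/"close" in the history was "open" (closed in S0)."""
    state = False
    for action in situation:
        if action == "open":
            state = True
        elif action == "close":
            state = False
    return state

s = do("close", do("open", S0))
print(door_open(s))  # False: the door was opened, then closed again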

Decompiler

Decompilers are programs that take executable files as input and try to create high-level source files that can be recompiled. Quite often, decompilers won’t be able to reconstruct the original source code perfectly and will end up creating obfuscated code. But they’re still vital in reverse engineering computer software.

While most decompilers are used to recreate source code from binary executables, some even have the capability to transform specific binary data files into human-readable and editable sources.

Machine vision

Machine vision is all about computers having the ability to see things. Machine vision systems make use of video cameras, analog-to-digital conversion (ADC), and digital signal processing (DSP) for this purpose, and the resulting data is sent to a computer or robot controller.

Sensitivity and resolution are very important specifications in these systems. Sensitivity refers to how well the machine can see in dim light or detect weak impulses at invisible wavelengths. Resolution refers to the computer’s ability to differentiate between objects.


Sparse matrix

A sparse matrix is a matrix in which far more elements hold zero values than non-zero values. Sparse representations make it possible to compute with large-scale matrices that dense storage just can’t handle.

Common sparse storage formats (these names come from SciPy) include the following; a short construction example follows the list:

  • csc_matrix: Compressed Sparse Column format
  • csr_matrix: Compressed Sparse Row format
  • bsr_matrix: Block Sparse Row format
  • lil_matrix: List of Lists format
  • dok_matrix: Dictionary of Keys format
  • coo_matrix: COOrdinate format (aka IJV, triplet format)
  • dia_matrix: DIAgonal format
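
A minimal construction example, assuming SciPy (the shape and values here are made up):

from scipy.sparse import csr_matrix

# Only the 3 non-zero entries of this 1000 x 1000 matrix are stored.
data = [5.0, 2.0, 7.0]
rows = [0, 10, 999]
cols = [3, 42, 999]
sparse = csr_matrix((data, (rows, cols)), shape=(1000, 1000))

print(sparse.nnz)               # 3 stored values instead of 1,000,000
print(sparse[10, 42])           # 2.0
print((sparse @ sparse.T).nnz)  # arithmetic stays in sparse form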


Symbolic interactionism

Symbolic interactionism is a sociological theory that also has applications in the Natural Language Processing (NLP) sub-domain of artificial intelligence. The symbolic approach to NLP focuses on human-developed rules and lexicons. 

The theory holds that meaning arises out of everyday communication and interaction: people form shared symbols and interpretations through their exchanges, and then reason and correspond with others on the basis of those shared meanings.

Stepwise function

The stepwise function is also known as the staircase function in mathematics. It is a piecewise constant function that has only a finite number of pieces.

In a stepwise function, the value is constant on each specified interval, but a different constant applies to each interval. The constant value on each interval creates a series of horizontal line segments, and the change of constant between intervals causes the jumps between those segments. This makes the graph of a step function resemble a set of stairs.
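
A small sketch of a step function in code (the pricing tiers here are hypothetical, purely for illustration):

import numpy as np

def shipping_cost(weight):
    # One constant per interval; a different constant on each interval.
    conditions = [weight < 1, (weight >= 1) & (weight < 5), weight >= 5]
    values = [2.0, 5.0, 9.0]
    return np.select(conditions, values)

# Horizontal segments at 2, 5, and 9, with jumps at weight 1 and 5.
print(shipping_cost(np.array([0.4, 1.0, 3.2, 7.5])))  # [2. 5. 5. 9.]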


DATR

DATR is a language used for lexical knowledge representation. It has been implemented in a wide range of programming languages, and several implementations are available on the internet, including an RFC-compliant implementation at the Bielefeld website. It is still used for encoding inheritance networks in several linguistic and non-linguistic domains, and it is currently under discussion as a standardized notation for the representation of lexical information.

Federated learning

Federated learning, aka collaborative learning, is a machine learning technique that involves training an algorithm across several decentralized edge devices or servers holding local data samples, without exchanging them. 

This type of learning makes it possible for multiple actors to develop a common, robust machine learning model without sharing data, which helps tackle critical issues like data privacy, data security, data access rights, and access to heterogeneous data. It makes continual learning possible on end-user devices while ensuring that end-user data never leaves those devices.
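
A minimal FedAvg-style sketch of the idea, assuming a simple linear model and synthetic client data (not any particular framework):

import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """A client's local training step; its raw data never leaves here."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):  # three devices, each with private local samples
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):
    # The server broadcasts the model, clients train locally,
    # and only the resulting weights are averaged centrally.
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_ws, axis=0)

print(global_w)  # approaches [2, -1] without pooling any raw data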


Principle of Rationality

The Principle of Rationality (Rationality Principle) suggests that agents act in the most appropriate manner given the objective situation. It is an idealized conception of human behavior that Karl R. Popper (who coined the term) used to drive his model of situational analysis.

In artificial intelligence, rational agents are essentially agents whose actions are logical with respect to the information or situation the agent has processed and the agent’s goals (the objective the agent was designed to achieve).

Rationality is judged on the basis of:

  • The existence of a performance measure that defines the success criterion.
  • The agent having prior knowledge about its environment.
  • The best possible actions for an agent to perform.
  • The agent’s percept sequence to date.


Intelligence amplification

Intelligence amplification is all about using information technology to augment human intelligence. While AI seeks to give computers human-like intelligence to perform autonomous workflows and act as standalone systems that have the ability to process information and even make decisions, the goal of intelligence amplification (IA) is to complement and amplify human intelligence.

Instead of trying to reinvent the wheel, IA just tries to build on and supplement human intelligence that has been evolving for millions of years. Intelligence amplification is also known as cognitive augmentation or machine augmented intelligence.

Reification

Reification refers to a process in which an abstract idea about a computer application gets turned into an explicit data model or another object created in a programming language. The process of reification allows things that were previously implicit, unexpressed, and possibly even inexpressible to be formulated explicitly and made available for conceptual (logical or computational) manipulation. It is a very popular conceptual analysis and knowledge representation technique.

Hyper-heuristic

Hyper-heuristics are search heuristics that automate the process of selecting, combining, generating, and adapting several simpler heuristics. They are used to solve complex computational search problems that could not be handled by the simpler heuristics on their own.

They are essentially high-level automated search techniques that explore the search space of low-level heuristics or heuristic components to deal with complicated computational search problems. Hyper-heuristics aim to lower the amount of domain knowledge needed in search methods.

Thompson sampling

Thompson sampling is an algorithm that makes use of exploration and exploitation to choose actions that would maximize the rewards earned. This technique is also known as Probability Matching or Posterior Sampling.

The results of the exploration could be rewards or penalties, and they help in figuring out which actions will be carried out to improve future performance.
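
Here’s a minimal Beta-Bernoulli bandit sketch (the payout probabilities are invented for illustration):

import numpy as np

rng = np.random.default_rng(1)
true_probs = [0.3, 0.5, 0.7]   # hidden payout rates of three actions
successes = np.ones(3)         # Beta(1, 1) uniform priors
failures = np.ones(3)

for _ in range(1000):
    # Exploration and exploitation in one move: sample a plausible
    # payout rate for each action from its posterior, play the best.
    samples = rng.beta(successes, failures)
    arm = int(np.argmax(samples))
    reward = rng.random() < true_probs[arm]
    successes[arm] += reward
    failures[arm] += 1 - reward

print(successes / (successes + failures))  # posterior mean per action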


Computational number theory

Computational number theory, also referred to as algorithmic number theory, is the branch of number theory that concentrates on finding and applying efficient computational techniques and algorithms to solve problems in number theory and arithmetic geometry. It is widely used for primality testing and for the prime factorization of large integers, and it is also popular in elliptic curve cryptography, RSA, and post-quantum cryptography.
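
As one concrete example, here’s the Miller-Rabin probabilistic primality test, a workhorse of the field (a sketch, not hardened for cryptographic use):

import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin: a 'composite' verdict is always correct, while a
    'probably prime' verdict errs with probability <= 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:  # write n - 1 as d * 2**r with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # found a witness that n is composite
    return True

print(is_probable_prime(2**61 - 1))  # True: a Mersenne prime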

Admissible heuristic

An admissible heuristic is a heuristic employed to estimate the cost of reaching the goal state in a search algorithm, and it never overestimates that cost. Using admissible heuristics also yields optimal solutions: they always find the cheapest-path solution.

In order for a heuristic to be admissible for a search problem, its estimate must be less than or equal to the actual cost of reaching the goal.
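
For example, Manhattan distance is admissible for grid pathfinding with four-directional, unit-cost moves, since no such path can be shorter than it:

def manhattan(cell, goal):
    # Never overestimates: each 4-directional move closes the
    # horizontal or vertical gap by at most one.
    (x, y), (gx, gy) = cell, goal
    return abs(x - gx) + abs(y - gy)

# h(n) <= h*(n): the estimate from (0, 0) to (3, 4) is 7 moves,
# and the true cheapest path on an open grid is exactly 7 moves.
print(manhattan((0, 0), (3, 4)))  # 7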


Voice User Interface

A voice user interface is a means for a user to interact with and make use of a computer system through voice commands, along with or instead of a touchscreen, trackpad, keyboard, or mouse. It involves using speech recognition to understand voice commands and answer questions, and text-to-speech to play a reply.


Batch processing

Batch processing is a technique for running high-volume, repetitive data jobs. This method makes it possible to process data when computing resources are available, with little or no user interaction.

When batch processing is carried out, users collect and store data, and then process it during an event known as a “batch window.” This increases efficiency by establishing processing priorities and completing data jobs at a time that makes the most sense.

Abductive Logic Programming (ALP)

Abductive logic programming (ALP) is a high-level knowledge-representation framework. You can use it to solve problems declaratively on the basis of abductive reasoning. 

Abductive reasoning is based on building and testing hypotheses using the best information available. Everyday decision-making relies on abductive reasoning, because a person works with whatever information is available, even if that information is incomplete.


Kubernetes

Kubernetes is a platform that enables you to manage containerized workloads and services more effectively while facilitating declarative configuration and automation. It supports data center outsourcing to public cloud service providers but is also employed for web hosting at scale. 

In 2014, Google turned it into an open-source project. It’s essentially a compilation of 15+ years of Google’s experience in running production workloads at scale. 


Dynamic time warping

Dynamic time warping is an algorithm in time series analysis that is used to measure the similarity between two temporal sequences, which could vary in speed. Essentially, any data that can be turned into a linear sequence can be analyzed using dynamic time warping. It is very widely used in automatic speech recognition to deal with varying speaking speeds.
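
A minimal sketch of the classic dynamic-programming formulation for two 1-D sequences (real systems often add a warping window to cut the quadratic cost):

import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)  # cumulative alignment costs
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # stretch a
                                 D[i, j - 1],      # stretch b
                                 D[i - 1, j - 1])  # step both
    return D[n, m]

# The second sequence is a half-speed version of the first;
# DTW aligns them perfectly despite the different lengths.
print(dtw([0, 1, 2, 1, 0], [0, 0, 1, 1, 2, 2, 1, 1, 0, 0]))  # 0.0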

Variational autoencoder

Variational autoencoders are autoencoders whose training is regularized to avoid overfitting and to ensure that the latent space has good properties that enable a generative process. They are generative systems and serve a purpose similar to that of a generative adversarial network.

Variational autoencoders tackle latent space irregularity by having the encoder return a distribution over the latent space rather than a single point, and by adding a regularization term over that returned distribution to the loss function, which ensures that the latent space is better organized.
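
A sketch of those two ingredients, assuming a Gaussian encoder (the networks producing mu and log_var, and the decoder, are omitted):

import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    # Sample z ~ N(mu, sigma^2) via z = mu + sigma * eps, eps ~ N(0, I),
    # so gradients can flow through mu and log_var during training.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, sigma^2) || N(0, I) ): the regularization
    # term that keeps the latent space well organized.
    return -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var))

mu = np.array([0.5, -0.2])       # what the encoder would output
log_var = np.array([-1.0, 0.3])
z = reparameterize(mu, log_var)  # fed to the decoder
# total loss = reconstruction_error(decoder(z), x)
#            + kl_to_standard_normal(mu, log_var)
print(kl_to_standard_normal(mu, log_var))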


Brute-force search

Brute-force search involves generating a list of all the possible candidates for a solution and then testing the validity of every single candidate. Since it requires no domain knowledge whatsoever, it is the simplest and most general search technique. It is also known as exhaustive search or generate and test.
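
A small generate-and-test example: finding every subset of a list that sums to a target (fine for tiny inputs, hopeless at scale, since there are 2**n candidates):

from itertools import product

def subset_sum_brute_force(numbers, target):
    solutions = []
    # Generate: every possible include/exclude mask over the numbers.
    for mask in product([0, 1], repeat=len(numbers)):
        candidate = [x for x, keep in zip(numbers, mask) if keep]
        # Test: keep the candidate only if it is actually valid.
        if sum(candidate) == target:
            solutions.append(candidate)
    return solutions

print(subset_sum_brute_force([3, 1, 4, 2], 6))  # [[4, 2], [3, 1, 2]]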


Algorithmic probability

Algorithmic probability is a mathematical method of assigning a prior probability to a given observation. It is also known as Solomonoff probability. Techniques built on algorithmic probability include:

  • An Algorithmic Probability Loss Function
  • Categorical Algorithmic Probability Classification
  • Approximating the Algorithmic Similarity Function


Spatial-temporal Reasoning

Spatial-temporal reasoning is a branch of artificial intelligence that builds on concepts from computer science, cognitive science, and cognitive psychology. Its theoretical aim is to represent and reason about spatial-temporal knowledge in the mind. On the computing side, it aims to develop high-level control systems that enable automata to navigate and understand time and space.

Lexical-Functional Grammar

Lexical-functional grammar (LFG) is a constraint-based grammar framework in theoretical linguistics, and one of the hottest topics in Natural Language Processing (NLP) right now. LFG embodies a constraint-based philosophy of grammar and posits two basic levels of representation: c-structure (constituent structure) and f-structure (functional structure).

Transduction

In logic, statistical inference, and machine learning, transduction refers to reasoning from observed, particular (training) cases to specific (test) cases. In statistical learning, the term is used for predicting specific examples directly when particular examples from a domain are given. Transduction is also known as transductive inference.

Rete Algorithm

The Rete algorithm is a pattern matching algorithm used to implement rule-based systems. It was created to efficiently apply many rules or patterns to many objects, or facts, in a knowledge base. The algorithm figures out which of the system’s rules need to be fired based on its data store of facts.
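
For contrast, here’s a naive matcher that re-tests every rule against every fact on each cycle; this repeated work is exactly what Rete’s network of shared, cached pattern nodes avoids (the rules below are invented):

facts = {("temperature", "high"), ("pressure", "rising")}

rules = [
    # (name, conditions that must all hold, fact to assert)
    ("alarm", {("temperature", "high"), ("pressure", "rising")},
     ("alarm", "on")),
    ("vent", {("alarm", "on")}, ("vent", "open")),
]

changed = True
while changed:
    changed = False
    for name, conditions, conclusion in rules:
        # Naive: every rule is re-checked against the whole store on
        # each cycle; Rete would only propagate the changed facts.
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)  # fire the rule
            changed = True

print(sorted(facts))  # the alarm and vent facts have both been asserted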

Theory of computation

The theory of computation is a theoretical branch of computer science and mathematics that focuses on the logic of computation with respect to simple machines, referred to as automata (a tiny automaton sketch follows the list below). The whole point of this theory is to develop mathematical and logical models of computation that run efficiently and eventually halt.

It forms the basis for:

  • Writing efficient algorithms that run in computing devices.
  • Programming language research and their development.
  • Efficient compiler design and construction.
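
A tiny automaton sketch: a deterministic finite automaton that accepts binary strings containing an even number of 1s:

# State transition table for the two-state parity automaton.
TRANSITIONS = {("even", "0"): "even", ("even", "1"): "odd",
               ("odd", "0"): "odd", ("odd", "1"): "even"}

def accepts(string, start="even", accepting=("even",)):
    state = start
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]
    return state in accepting

print(accepts("1011"))  # False: three 1s
print(accepts("1001"))  # True: two 1s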


Want to learn more about tech, AI, CX, and eCommerce concepts? Check out the Engati Glossary!



Jeremy DSouza

Jeremy is a marketer at Engati with an interest in marketing psychology and consumer neuroscience. Over the last year he has interviewed many of the world's brightest CX, AI, Marketing, and Tech thought leaders for Engati CX.

