Updated June 9, 2023
What is a Markov Logic Network?
A Markov logic network (MLN) L is a set of pairs (F, w), where F is a formula in first-order logic and w is a real-valued weight attached to that formula. Together with a finite set of constants C = {c1, c2, …, cn}, an MLN completely defines a Markov network, usually written M(L, C).
The Ground Markov Network M(L, C)
- M(L, C) contains one binary node for each possible grounding of each atom appearing in L. The node's value is 1 if the ground atom is true and 0 otherwise.
- M(L, C) contains one feature for each possible grounding of each formula F in L. The value of this feature is 1 if the ground formula is true and 0 otherwise, and its weight is the weight w associated with F.
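Putting these two points together, the probability an MLN assigns to a possible world x (one complete truth assignment to all the ground atoms) is:

$$P(X = x) \;=\; \frac{1}{Z} \exp\!\left(\sum_{i} w_i \, n_i(x)\right), \qquad Z \;=\; \sum_{x'} \exp\!\left(\sum_{i} w_i \, n_i(x')\right)$$

where the sum runs over the formulas in L, n_i(x) is the number of true groundings of the i-th formula in the world x, w_i is its weight, and Z is the normalizing constant (partition function) obtained by summing over all possible worlds.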
Basically, a Markov logic network combines first-order logic, which is a set of formulas over Boolean (true/false) ground atoms, with probabilistic graphical models. First-order logic on its own imposes hard constraints on the set of possible worlds: any world that violates even one formula has zero probability. Real-world scenarios need more flexibility, so Markov logic softens these constraints: a world that violates a formula becomes less probable, not impossible. In an MLN, the fewer formulas a world violates, the higher its probability.
In an MLN, each formula has a weight associated with it; the higher the weight, the stronger the constraint. Other things being equal, a higher weight means a larger gap in probability between a world that satisfies the formula and one that does not.
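To make the effect of weights concrete, here is a minimal, self-contained sketch (the two-person domain, the predicate names, and the single formula Smokes(x) ⇒ Cancer(x) are made up purely for illustration). It brute-forces the MLN distribution over all 16 possible worlds and shows how raising the weight shrinks, but never zeroes out, the probability of worlds that violate the formula.

```python
from itertools import product
import math

# Toy domain (names and formula are made up for illustration): two people,
# two predicates, so 4 ground atoms and 2^4 = 16 possible worlds.
PEOPLE = ["Anna", "Bob"]
ATOMS = [f"{pred}({p})" for pred in ("Smokes", "Cancer") for p in PEOPLE]

def n_true(world):
    """n(x): number of true groundings of Smokes(x) => Cancer(x) in world x."""
    return sum((not world[f"Smokes({p})"]) or world[f"Cancer({p})"] for p in PEOPLE)

def world_probabilities(weight):
    """Brute-force the MLN distribution P(x) = exp(w * n(x)) / Z."""
    worlds = [dict(zip(ATOMS, values))
              for values in product([False, True], repeat=len(ATOMS))]
    scores = [math.exp(weight * n_true(w)) for w in worlds]
    z = sum(scores)  # partition function: sum over all possible worlds
    return [(w, s / z) for w, s in zip(worlds, scores)]

if __name__ == "__main__":
    for weight in (0.5, 3.0):
        violating = sum(p for w, p in world_probabilities(weight)
                        if n_true(w) < len(PEOPLE))
        print(f"weight {weight}: probability mass on violating worlds = {violating:.3f}")
```

With the larger weight, the worlds that break the formula keep only a small slice of the probability mass, which is exactly the soft-constraint behavior described above.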
In first-order logic, formulas are built from four types of symbols: constants, variables, predicates, and functions. Constants represent objects in the domain (for example, names of people such as Rahul or Sachin). Variables range over the objects in the domain (for example, a variable ranging over cities such as Delhi, Hyderabad, or Bengaluru). Functions map objects to objects (for example, BrotherOf). Predicates represent relations among objects or attributes of an object (for example, Colleagues).
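As an illustration (the predicate names and the weight value are chosen only for the example), a typical weighted formula might state that friends tend to have similar smoking habits:

$$1.5 \quad \forall x\, \forall y \;\; \mathrm{Friends}(x, y) \Rightarrow \big(\mathrm{Smokes}(x) \Leftrightarrow \mathrm{Smokes}(y)\big)$$

Here x and y are variables ranging over people, Friends and Smokes are predicates, a constant such as Rahul could be substituted for x or y, and a term such as BrotherOf(Rahul) would use a function symbol. The weight 1.5 marks the formula as a fairly strong, but not inviolable, constraint.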
In a Markov logic network, inference is typically done with MCMC (Markov chain Monte Carlo) over the minimal ground network needed to answer a query. The weights attached to the formulas are learned iteratively from relational databases by optimizing a likelihood measure, and techniques from inductive logic programming can be used to learn additional clauses. Because full first-order knowledge bases are often intractable, restricted subsets with more favorable properties are used in practice; the most widely used is the Horn clause, which contains at most one positive literal.
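As a rough sketch of the MCMC idea (again using the made-up toy domain from the earlier sketch rather than any particular MLN library), the Gibbs sampler below repeatedly resamples each ground atom from its conditional distribution given all the others, then estimates a marginal query from the collected samples.

```python
import math
import random

# Toy ground network (illustrative only, not a real library's API):
# 4 ground atoms and the 2 groundings of Smokes(x) => Cancer(x), weight 1.5.
ATOMS = ["Smokes(Anna)", "Smokes(Bob)", "Cancer(Anna)", "Cancer(Bob)"]
GROUND_FORMULAS = [
    (1.5, lambda w: (not w["Smokes(Anna)"]) or w["Cancer(Anna)"]),
    (1.5, lambda w: (not w["Smokes(Bob)"]) or w["Cancer(Bob)"]),
]

def score(world):
    """Unnormalized log-probability: sum of weights of the satisfied groundings."""
    return sum(wt for wt, formula in GROUND_FORMULAS if formula(world))

def gibbs(num_samples, burn_in=500, seed=0):
    """Resample each ground atom from its conditional given all the others."""
    rng = random.Random(seed)
    world = {atom: rng.random() < 0.5 for atom in ATOMS}
    samples = []
    for step in range(burn_in + num_samples):
        for atom in ATOMS:
            world[atom] = True
            s_true = score(world)
            world[atom] = False
            s_false = score(world)
            p_true = math.exp(s_true) / (math.exp(s_true) + math.exp(s_false))
            world[atom] = rng.random() < p_true
        if step >= burn_in:
            samples.append(dict(world))
    return samples

if __name__ == "__main__":
    samples = gibbs(5000)
    estimate = sum(s["Cancer(Anna)"] for s in samples) / len(samples)
    print(f"Estimated marginal P(Cancer(Anna)) ~ {estimate:.3f}")
```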
Where do we use a Markov logic network?
A Markov logic network makes it easy to combine a knowledge base with probabilistic reasoning. This makes it useful for a number of tasks in statistical relational learning:
- Social Network modeling
- Link-based clustering
- Collective classification
- Link-based predictions
- Object identification
Markov logic is a high-level language and can easily be applied to highly connected data. It does a good job of identifying patterns and inferring relations in datasets with rich higher-level structure. Once those relations are detected, Markov logic can flatten them into almost structure-less tables of the kind generally used by traditional machine learning algorithms.
Importance of Markov logic network
As discussed above, a Markov logic network is essentially a set of first-order logic formulas with associated weights that, together with a set of constants, generates a Markov network. Some important features of Markov logic networks are:
- Most real-world problems cannot be modeled by probability theory alone or by first-order logic alone. Markov logic does a great job of combining both to solve real-world problems.
- A Markov logic network combines logical inference and probabilistic inference. Markov logic can use lifted inference algorithms, which operate at the compact first-order level and ground out (propositionalize) only when necessary. This makes it more scalable and able to work on much larger networks.
- Sampling from the defined distribution and computing formula weights from complex relational structures are integral parts of working with an MLN.
- Compressed representations of the ground Markov network can be used for scalable sampling from the propositional distribution.
- Where a deep learning solution can take a good amount of coding time, sometimes thousands of lines of code, the same behavior can often be expressed with a few logic formulas in a Markov logic network.
Conclusion
When it comes to dealing with real-world problems, we cannot get far with first-order logic alone or with probability theory alone. Markov logic provides a platform where the two can be combined. In this article, we have taken a high-level view of Markov logic and discussed its uses and importance. Markov logic could well drive the next advances in AI; I would suggest going through the article linked below on the next level of machine learning, which argues that Markov logic is well suited to future machine learning techniques.
Recommended Articles
This is a guide to the Markov Logic Network. Here we discuss the importance of the Markov logic network and where to use it, along with the ground Markov network M(L, C) it defines. You may also have a look at the following articles to learn more –