{"id":14100,"date":"2020-09-24T13:37:44","date_gmt":"2020-09-24T08:07:44","guid":{"rendered":"https:\/\/www.mygreatlearning.com\/blog\/linear-discriminant-analysis-or-lda\/"},"modified":"2024-10-15T00:11:14","modified_gmt":"2024-10-14T18:41:14","slug":"linear-discriminant-analysis-or-lda","status":"publish","type":"post","link":"https:\/\/www.mygreatlearning.com\/blog\/linear-discriminant-analysis-or-lda\/","title":{"rendered":"Linear Discriminant Analysis or LDA in Python"},"content":{"rendered":"\n<p>Linear discriminant analysis is a <a href=\"https:\/\/www.mygreatlearning.com\/academy\/learn-for-free\/courses\/supervised-machine-learning-with-logistic-regression-and-naive-bayes\" target=\"_blank\" rel=\"noreferrer noopener\">supervised machine learning<\/a> technique used to find a&nbsp;linear combination of features that separates two or more classes of objects or events.&nbsp;<\/p>\n\n\n\n<p>Linear discriminant analysis, also known as LDA, does the separation by computing the directions (\u201clinear discriminants\u201d) that represent the axes that maximize the separation between multiple classes. 
Also, the free <a href=\"https:\/\/www.mygreatlearning.com\/academy\/learn-for-free\/courses\/linear-discriminant-analysis-applications\" target=\"_blank\" rel=\"noreferrer noopener\">Linear Discriminant Analysis Applications<\/a> course can help you learn how LDA is applied to dimensionality reduction problems.<\/p>\n\n\n\n<ol><li><a href=\"#reducedimensions\">Why do we need to reduce dimensions in a data set?<\/a><\/li><li><a href=\"#highdimensionaldataset\">How to deal with a high dimensional dataset?<\/a><\/li><li><a href=\"#assumptionsoflda\">Assumptions of LDA<\/a><\/li><li><a href=\"#howldaworks\">How LDA works<\/a><\/li><li><a href=\"#preparelda\">How to Prepare the data for LDA<\/a>  <\/li><li><a href=\"#pythonimplementationoflda\">Python implementation of LDA from scratch<\/a><\/li><li><a href=\"#ldaimplementingscikitlearn\">Linear Discriminant Analysis implementation leveraging scikit-learn library<\/a><\/li><\/ol>\n\n\n\n<p>Like <a href=\"https:\/\/www.mygreatlearning.com\/academy\/learn-for-free\/courses\/logistic-regression-algorithm\">Logistic Regression<\/a>, LDA is a linear classification technique, with the following additional capabilities in comparison to logistic regression:<\/p>\n\n\n\n<p>1.&nbsp;&nbsp;&nbsp;&nbsp; LDA can be applied to two-class as well as multi-class classification problems.<\/p>\n\n\n\n<p>2.&nbsp;&nbsp;&nbsp;&nbsp; Unlike Logistic Regression, LDA works better when the classes are well separated.<\/p>\n\n\n\n<p>3.&nbsp;&nbsp;&nbsp;&nbsp; LDA works relatively well in comparison to Logistic Regression when we have only a few training examples.<\/p>\n\n\n\n<p>LDA is also a dimensionality reduction technique. As the name implies, dimensionality reduction techniques reduce the number of dimensions (i.e. 
variables or dimensions or features) in a dataset while retaining as much information as possible.&nbsp;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"why-do-we-need-to-reduce-dimensions-in-a-data-set\"><strong>Why do we need to reduce dimensions in a data set?<\/strong><\/h2>\n\n\n\n<p>For a moment, let\u2019s assume that the world in which we live has one dimension. 
Finding something in this one-dimensional world is like starting the search at one end and heading towards&nbsp;the other end, continuing until you find the object you are looking for.<\/p>\n\n\n\n<p>In the image below the line represents the dimension and the red circle represents the object.<\/p>\n\n\n\n<p>Now, if we add another dimension, the world becomes two-dimensional.&nbsp; If we attempt to find something in it, the search is relatively more complex than it was earlier (in one dimension). The image below helps explain the additional complexity introduced by the new dimension.<br><\/p>\n\n\n\n<p>In a nutshell, finding something in a lower dimension is relatively easy in comparison to doing the same in a higher dimension. This can be understood with the help of the phenomenon called&nbsp;\u201c<strong>the&nbsp;curse of dimensionality\u201d.<\/strong><\/p>\n\n\n\n<p>Domains like numerical analysis, statistical sampling, machine learning, data mining and modern databases share a common problem: an increase in dimensionality increases the volume of the data space.<\/p>\n\n\n\n<p>This in turn leads to sparsity of the data, which is a problem for any method involving statistical significance.&nbsp;<\/p>\n\n\n\n<p>In the machine learning landscape, for a dataset with a high-dimensional feature space where each feature takes a range of values, a huge amount of data is typically required to uncover the hidden complex patterns within the dataset.<\/p>\n\n\n\n<p>Also Read: <a href=\"https:\/\/www.mygreatlearning.com\/blog\/linear-regression-in-machine-learning\/\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Linear Regression in Machine Learning (opens in a new tab)\">Linear Regression in Machine Learning<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"how-to-deal-with-a-high-dimensional-dataset\"><strong>How to deal with a high dimensional dataset?<\/strong><\/h2>\n\n\n\n<p>There are several ways to deal 
with high-dimensional data; below are a few commonly used techniques:<\/p>\n\n\n\n<p><strong>Feature extraction<\/strong>&nbsp; <\/p>\n\n\n\n<p>Feature extraction (or feature selection) is widely used in the fields of statistical studies and machine learning. Deciding on a&nbsp;feature to be extracted requires a good understanding of the domain and prior knowledge of the subject under consideration. For example, in the field of computer vision, imagine that we have a 100\u00d7100 pixel image. Flattened into a raw intensity vector, it has 10,000 dimensions. Often the image corners do not contain much useful information. Dimensionality could be reduced significantly, at the cost of a small amount of information loss, by retaining the pixels located in the central region and dropping those at the corners.<\/p>\n\n\n\n<p><strong>Dimensionality reduction analysis <\/strong><\/p>\n\n\n\n<p>Statisticians have developed <strong>methods to <\/strong>automatically reduce dimensionality. Different methods lead to different reduction results.&nbsp;<\/p>\n\n\n\n<p>1.&nbsp;&nbsp;&nbsp;&nbsp; <strong>Principal component analysis<\/strong>: The goal of PCA is to project the original high-dimensional data onto a low-dimensional space without losing much vital information. The directions (principal components) along which the data shows the largest variance are kept. 
Redundant and correlated features are dropped.&nbsp;<\/p>\n\n\n\n<p><strong>2.<\/strong>&nbsp;&nbsp;&nbsp;&nbsp; <strong>Linear discriminant analysis: <\/strong>The goal of LDA is to discriminate different classes in a low-dimensional space by retaining the components whose feature values give the best separation across classes.&nbsp;<\/p>\n\n\n\n<p>Refer to the image below for a visual depiction of the underlying difference between the two techniques:<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"assumptions-of-lda\"><strong>Assumptions of LDA<\/strong><\/h2>\n\n\n\n<p>LDA assumes:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>Each feature (variable or dimension or attribute) in the dataset follows a Gaussian distribution. In other words, the histogram of each feature is shaped like a bell curve.<\/li>\n<\/ol>\n\n\n\n<p>2. Each feature has the same variance; the values of each feature vary around the mean by the same amount on average.<\/p>\n\n\n\n<p>3. Each feature is assumed to be randomly sampled.<\/p>\n\n\n\n<p>4. Lack of multicollinearity in the independent features. As the correlation between independent features increases, the predictive power decreases. 
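These assumptions can be checked programmatically. The sketch below is our own illustration (not part of the original article), run on a synthetic stand-in dataset: a Shapiro-Wilk test for the normality assumption and a feature correlation matrix for the multicollinearity assumption.

```python
# Illustrative sketch (not from the article): checking two LDA assumptions
# on a synthetic stand-in dataset -- normality via the Shapiro-Wilk test
# and multicollinearity via the feature correlation matrix.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.normal(loc=0.0, scale=1.0, size=(178, 3))  # 178 samples, 3 features

# Assumption 1: each feature is Gaussian (p > 0.05 fails to reject normality)
for j in range(X.shape[1]):
    stat, p = stats.shapiro(X[:, j])
    print('feature', j, 'Shapiro-Wilk p-value:', round(p, 3))

# Assumption 4: low multicollinearity -- absolute off-diagonal correlations
# near 1 would signal a problem
corr = np.corrcoef(X, rowvar=False)
off_diag_max = np.abs(corr - np.eye(3)).max()
print('max off-diagonal correlation:', round(off_diag_max, 3))
```

In practice you would run these checks on each column of your actual feature matrix before fitting LDA.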
<\/p>\n\n\n\n<p>Also Read: <a href=\"https:\/\/www.mygreatlearning.com\/blog\/understanding-eda-in-python\/\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Understanding EDA in Python (opens in a new tab)\">Understanding EDA in Python<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"how-lda-works\"><strong>How LDA works<\/strong><\/h2>\n\n\n\n<p>LDA projects features from a higher-dimensional space onto a lower-dimensional space. Let\u2019s look into how LDA achieves this:<\/p>\n\n\n\n<p><em>Step#1 Computes the mean vectors of each class of the dependent variable<\/em><\/p>\n\n\n\n<p><em>Step#2 Computes within-class and between-class scatter matrices<\/em><\/p>\n\n\n\n<p><em>Step#3 Computes eigenvalues and eigenvectors for SW (the within-class scatter matrix) and SB (the between-class scatter matrix)<\/em><\/p>\n\n\n\n<p><em>Step#4<\/em> <em>Sorts the eigenvalues in descending order and selects the top k<\/em><\/p>\n\n\n\n<p><em>Step#5 Creates a new matrix containing the eigenvectors that map to the top k eigenvalues<\/em><\/p>\n\n\n\n<p><em>Step#6 Obtains the new features (i.e. linear discriminants) by taking the dot product of the data and the matrix.&nbsp;<\/em><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"how-to-prepare-the-data-for-lda\"><strong>How to Prepare the data for LDA<\/strong><\/h2>\n\n\n\n<p>Machine learning model performance is greatly dependent upon how well we pre-process data. 
Let\u2019s see how to prepare our data before we apply LDA:<\/p>\n\n\n\n<p><em>1.<\/em>&nbsp;<strong><em>Outlier Treatment<\/em><\/strong><em>: Outliers should be removed from the data; outliers introduce skewness, which in turn influences the computations of mean and variance and finally impacts the LDA computations.<\/em><\/p>\n\n\n\n<p><em>2.<\/em> <strong><em>Equal Variance<\/em><\/strong><em>: Standardize the input data such that it has a mean of 0 and a standard deviation of 1.<\/em><\/p>\n\n\n\n<p><em>3.<\/em> <strong><em>Gaussian distribution<\/em><\/strong><em>: Perform univariate analysis of each input feature; if a feature does not exhibit a Gaussian distribution, transform it to look Gaussian (log and root transforms for exponential distributions).<\/em> Check out the free course on <a href=\"https:\/\/www.mygreatlearning.com\/academy\/learn-for-free\/courses\/multiple-variate-analysis\" target=\"_blank\" rel=\"noreferrer noopener\">multiple variate analysis<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"python-implementation-of-lda-from-scratch\"><strong>Python implementation of LDA from scratch<\/strong><\/h2>\n\n\n\n<p>We will be using the Wine dataset available in scikit-learn for our analysis and model building.<\/p>\n\n\n\n<p><strong>Step#1<\/strong> Importing the required libraries in our Jupyter notebook<\/p>\n\n\n\n<p><strong>Step#2<\/strong> Loading the dataset and separating the dependent variable and independent variables into variables named <strong>\u201cdependentVariable<\/strong>\u201d and \u201c<strong>independentVariables<\/strong>\u201d respectively<br><\/p>\n\n\n\n<p><strong>Step#3<\/strong> Let\u2019s have a quick look at our independentVariables.<br><\/p>\n\n\n\n<p><strong>Step#4<\/strong> Let\u2019s have a quick look at the details of independentVariables in terms of the number of rows and columns it contains.<\/p>\n\n\n\n<p>We have 178 records captured against 13 
attributes.<\/p>\n\n\n\n<p><strong>Step#5<\/strong> Let\u2019s have a quick look at the details of independentVariables<\/p>\n\n\n\n<p>We have 13 columns in the dataset. We can infer that:<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li><em>All the features are numeric, with their respective values stored as floats<\/em><\/li>\n\n\n\n<li><em>All 13 features have 178 records, which is enough to conclude that we don\u2019t have any missing values in any of the independent features.<\/em><\/li>\n<\/ol>\n\n\n\n<p><strong>Step#6<\/strong> Let\u2019s look into the target variable&nbsp;<\/p>\n\n\n\n<p>We can see that our dependentVariable has three classes: 'class_0', 'class_1' and 'class_2'. We have a three-class classification problem.<\/p>\n\n\n\n<p><strong>Step#7<\/strong> Let\u2019s now create a data frame containing the dependent variable and independent variables together<\/p>\n\n\n\n<p><strong>Step#8<\/strong> Now for the fun part: let\u2019s compute the mean feature vector for every class and store them in a variable named \u201cbetween_class_feature_means\u201d<\/p>\n\n\n\n<p><strong>Step#9<\/strong> Now, let\u2019s use the class mean vectors to compute the within-class scatter matrix.&nbsp;&nbsp;<\/p>\n\n\n\n<p><strong>Step#10<\/strong> Now, let\u2019s obtain the linear discriminants by computing the eigenvalues and eigenvectors of the matrix inverse(SW) multiplied by SB.&nbsp;<\/p>\n\n\n\n<p><strong>Step#11 <\/strong>The eigenvectors with the highest eigenvalues carry the most information about the distribution of the data. Now that we have the eigenvalues and eigenvectors, let\u2019s sort the eigenvalues from highest to lowest and select the first k eigenvectors.<\/p>\n\n\n\n<p>In order to ensure that each eigenvalue maps to the same eigenvector after sorting, we place them in a temporary array.<\/p>\n\n\n\n<p><strong>Step#12 <\/strong>By just looking at the values obtained above it is difficult to determine the variance explained by each component. 
Thus, let\u2019s express it as a percentage.<\/p>\n\n\n\n<p><strong>Step#13 <\/strong>Now, let\u2019s formulate our linear function for the new feature space.<\/p>\n\n\n\n<p>\t\tY=X.W<\/p>\n\n\n\n<p>where X is an n\u00d7d matrix with n samples and d dimensions, and \u201cY\u201d is an n\u00d7k matrix with n samples and k (k&lt;d) dimensions. In other words, Y is composed of the LDA components (the new feature space).<\/p>\n\n\n\n<p><strong>Step#14 <\/strong>Since matplotlib cannot handle categorical variables directly, let\u2019s encode them. Every class will now be represented as a number.&nbsp;<\/p>\n\n\n\n<p><strong>Step#15 <\/strong>Finally, let\u2019s plot the data as a function of the two LDA components<\/p>\n\n\n\n<p>In the above plot we can clearly see the separation between all three classes.&nbsp; Kudos, LDA has done its job: the classes are linearly separated.<\/p>\n\n\n\n<p>From Step#8 to Step#15, we saw how to implement linear discriminant analysis in a step-by-step manner. Hopefully, this helps all readers understand the nitty-gritty of LDA.&nbsp;<\/p>\n\n\n\n<p>We can arrive directly at Step#15 by leveraging the <a href=\"https:\/\/www.mygreatlearning.com\/academy\/learn-for-free\/courses\/scikit-learn\" target=\"_blank\" rel=\"noreferrer noopener\">Scikit-Learn<\/a> library. 
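Condensing Steps #8 through #13 above into code, a minimal from-scratch sketch looks like the following. This is our own NumPy version (variable names other than those in the article are illustrative), keeping k=2 components.

```python
# Condensed from-scratch sketch of Steps #8 to #13 (our own NumPy version;
# variable names are illustrative), using the Wine dataset and k=2.
import numpy as np
from sklearn.datasets import load_wine

X, y = load_wine(return_X_y=True)   # 178 samples, 13 features, 3 classes
n_features = X.shape[1]
overall_mean = X.mean(axis=0)

# Step#8: mean feature vector per class
class_means = {c: X[y == c].mean(axis=0) for c in np.unique(y)}

# Step#9: within-class (SW) and between-class (SB) scatter matrices
SW = np.zeros((n_features, n_features))
SB = np.zeros((n_features, n_features))
for c, mu in class_means.items():
    Xc = X[y == c]
    SW += (Xc - mu).T @ (Xc - mu)
    d = (mu - overall_mean).reshape(-1, 1)
    SB += len(Xc) * (d @ d.T)

# Step#10: eigenvalues and eigenvectors of inverse(SW) @ SB
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(SW) @ SB)

# Step#11: sort eigenpairs by eigenvalue, keep the top k=2 eigenvectors
order = np.argsort(eigvals.real)[::-1]
W = eigvecs[:, order[:2]].real

# Step#12: variance explained by each discriminant, as a percentage
ratios = eigvals.real[order[:2]] / eigvals.real.sum()
print(np.round(ratios * 100, 2))

# Step#13: project the data (Y = X.W) to get the linear discriminants
X_lda = X @ W
print(X_lda.shape)  # -> (178, 2)
```

Only the top two eigenvalues are meaningfully non-zero here, because the between-class scatter matrix for a three-class problem has rank at most 2.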
Let\u2019s look into this.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\" id=\"linear-discriminant-analysis-implementation-leveraging-scikit-learn-library\"><strong><em>Linear Discriminant Analysis implementation leveraging&nbsp;scikit-learn library<\/em><\/strong><\/h2>\n\n\n\n<p>We already have our \u201cdependentVariable\u201d and \u201cindependentVariables\u201d defined; let\u2019s use them to get the linear discriminants.<br><\/p>\n\n\n\n<p>Let\u2019s call \u201cexplained_variance_ratio_\u201d on our fitted sklearn Linear Discriminant Analysis model<\/p>\n\n\n\n<p>From the above output we can see that LDA#1 covers 68.74% of the total variance and LDA#2 covers the remaining 31.2%.<\/p>\n\n\n\n<p>Now, let\u2019s visualize the output of the sklearn implementation-<\/p>\n\n\n\n<p>The results of both implementations are very much alike.<\/p>\n\n\n\n<p>Finally, let\u2019s use Random Forest to build the classification model; any other classifier could also be used.<br><\/p>\n\n\n\n<p>Let\u2019s check our model performance by printing the confusion matrix<br><\/p>\n\n\n\n<p>We can see from the above confusion matrix that the results are flawless.&nbsp;<\/p>\n\n\n\n<p>To appreciate the impact of LDA on classification, we will run the train-test split again, but this time our independent variable component will be \u201cindependentVariables\u201d itself, instead of the LDA components as above.<\/p>\n\n\n\n<p>We have defined new variables (<em>X_train1, X_test1, y_train1, y_test1<\/em>) holding our train and test data for the dependent and independent variables.&nbsp; Please note we have used \u201cindependentVariables\u201d, not \u201cX_lda\u201d.<\/p>\n\n\n\n<p>Let\u2019s check the confusion matrix for these new predictions (without linear discriminants)<br><\/p>\n\n\n\n<p>The above confusion matrix shows one misclassification for Class_2. 
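Putting the pieces of this section together, the scikit-learn workflow can be sketched end-to-end as follows (the train-test split parameters are our own illustrative choices, not prescribed by the article):

```python
# End-to-end sketch of the scikit-learn workflow described in this section
# (the train-test split parameters here are our own illustrative choices).
from sklearn.datasets import load_wine
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

independentVariables, dependentVariable = load_wine(return_X_y=True)

# Reduce the 13 features to 2 linear discriminants
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(independentVariables, dependentVariable)
print(lda.explained_variance_ratio_)  # roughly [0.687, 0.313]

# Classify on the LDA components with a Random Forest
X_train, X_test, y_train, y_test = train_test_split(
    X_lda, dependentVariable, test_size=0.25, random_state=1)
clf = RandomForestClassifier(random_state=1).fit(X_train, y_train)
print(confusion_matrix(y_test, clf.predict(X_test)))
```

Swapping X_lda for independentVariables in the split reproduces the no-LDA comparison discussed above.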
This shows LDA has certainly added value to the whole exercise.<\/p>\n\n\n\n<p>With this, we have come to the end of our long journey through linear discriminant analysis. We built LDA on the Wine dataset step by step. We started with 13 independent variables and then converged to 2 linear discriminants. Using those two linear discriminants, we called a Random Forest classifier to classify our test data.<\/p>\n\n\n\n<p>If you found this interesting and wish to learn more, check out this <a aria-label=\"Python Tutorial for beginners (opens in a new tab)\" href=\"https:\/\/www.mygreatlearning.com\/blog\/python-tutorial-for-beginners-a-complete-guide\/\" target=\"_blank\" rel=\"noreferrer noopener\">Python Tutorial for beginners<\/a> and take a look at <a href=\"https:\/\/www.mygreatlearning.com\/academy\/learn-for-free\/courses\/linear-programming-examples\" target=\"_blank\" rel=\"noreferrer noopener\">Linear Programming Examples<\/a> as well. <\/p>\n","protected":false},"excerpt":{"rendered":"<p>Linear discriminant analysis is a supervised machine learning technique used to find a&nbsp;linear combination of features that separates two or more classes of objects or events.&nbsp; Linear discriminant analysis, also known as LDA, does the separation by computing the directions (\u201clinear discriminants\u201d) that represent the axes that maximize the separation between multiple classes. 
Also, Linear [&hellip;]<\/p>\n","protected":false},"author":41,"featured_media":14103,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"set","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[2],"tags":[36799],"content_type":[36249],"class_list":["post-14100","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","tag-machine-learning","content_type-interview-questions"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.3 (Yoast SEO v27.3) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>What is LDA (Linear Discriminant Analysis) in Python<\/title>\n<meta name=\"description\" content=\"Linear Discriminant Analysis: Learn about how we build LDA on the Wine dataset step by step and gain an in-depth understanding of linear discriminant analysis with this tutorial.\" \/>\n<meta name=\"robots\" content=\"index, follow, 
max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.mygreatlearning.com\/blog\/linear-discriminant-analysis-or-lda\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Linear Discriminant Analysis or LDA in Python\" \/>\n<meta property=\"og:description\" content=\"Linear Discriminant Analysis: Learn about how we build LDA on the Wine dataset step by step and gain an in-depth understanding of linear discriminant analysis with this tutorial.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.mygreatlearning.com\/blog\/linear-discriminant-analysis-or-lda\/\" \/>\n<meta property=\"og:site_name\" content=\"Great Learning Blog: Free Resources what Matters to shape your Career!\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/GreatLearningOfficial\/\" \/>\n<meta property=\"article:published_time\" content=\"2020-09-24T08:07:44+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-10-14T18:41:14+00:00\" \/>\n<meta property=\"og:image\" content=\"http:\/\/www.mygreatlearning.com\/blog\/wp-content\/uploads\/2020\/04\/iStock-1154268654.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1253\" \/>\n\t<meta property=\"og:image:height\" content=\"836\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Great Learning Editorial Team\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@https:\/\/twitter.com\/Great_Learning\" \/>\n<meta name=\"twitter:site\" content=\"@Great_Learning\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Great Learning Editorial Team\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"9 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/\"},\"author\":{\"name\":\"Great Learning Editorial Team\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#\\\/schema\\\/person\\\/6f993d1be4c584a335951e836f2656ad\"},\"headline\":\"Linear Discriminant Analysis or LDA in Python\",\"datePublished\":\"2020-09-24T08:07:44+00:00\",\"dateModified\":\"2024-10-14T18:41:14+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/\"},\"wordCount\":1889,\"commentCount\":5,\"publisher\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/04\\\/iStock-1154268654.jpg\",\"keywords\":[\"Machine Learning\"],\"articleSection\":[\"AI and Machine Learning\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/\",\"name\":\"What is LDA (Linear Discriminant Analysis) in 
Python\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/04\\\/iStock-1154268654.jpg\",\"datePublished\":\"2020-09-24T08:07:44+00:00\",\"dateModified\":\"2024-10-14T18:41:14+00:00\",\"description\":\"Linear Discriminant Analysis: Learn about how we build LDA on the Wine dataset step by step and gain an in-depth understanding of linear discriminant analysis with this tutorial.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#primaryimage\",\"url\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/04\\\/iStock-1154268654.jpg\",\"contentUrl\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/wp-content\\\/uploads\\\/2020\\\/04\\\/iStock-1154268654.jpg\",\"width\":1253,\"height\":836,\"caption\":\"Frontend Languages\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/linear-discriminant-analysis-or-lda\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Blog\",\"item\":\"https:\\\/\\\/www.mygreatlearning.com\\\/blog\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI and Machine 