
CommonLounge


Anirudh Thatipelli

CS student interested in Statistics

Active In

Machine Learning

1 reply. 6 discussions. Member

Computer Vision

1 reply. 1 discussion. Member

Competitive Programming

5 replies. Member

Deep Learning Book

1 reply. Member

Deep Learning

1 reply. Member

Algorithms and Data Structures

Member

Web Development

Member

Computer Programming

Member

Artificial Intelligence

Member

U.S. College Admissions for International Students

Member

Featured Contributions

Contributed 100%

2.

tutorial

Estimation

We have learned how to calculate the probability of events of interest. Now we can use these probabilities to make **inferences** (calculated guesses) about the important characteristics of the entire dataset. The entire collection of elements under investigation is known as the **population**. As it is difficult to investigate each and every element in this huge set, we randomly select a subset of elements, known as a **sample**. We can use techniques of **statistical inference** to estimate characteristics of the entire population by investigating only the smaller sample. Let us solve some questions to get a rough idea of estimation.

Q) Calculate the probability of heads in the given coin toss sequence **HHTH**.

Ans. By using the classical probability rule, the probability of getting heads, P(H) =
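The classical rule above (favorable outcomes divided by total outcomes) can be sketched as a quick computation; the sequence HHTH is taken from the question:

```python
# Estimate P(H) from the observed toss sequence using the classical
# probability rule: number of heads / total number of tosses.
sequence = "HHTH"
p_heads = sequence.count("H") / len(sequence)
print(p_heads)  # 0.75
```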

Read more…(571 words)

Category: Machine Learning

Contributed 100%

3.

tutorial

Probability distribution

Let us consider some different real-life examples and compute their respective probabilities. Consider the popular game **Spin the Bottle**:

Q) What is the probability that the bottle lands at ...

Read more…(634 words)

Category: Machine Learning

Contributed 100%

4.

tutorial

Bayes Theorem

Till now, we have calculated the probability of events based on empirical (experimental) evidence or on the probabilities of other events. As we remarked in earlier tutorials, statistics enables us to tell a story about data; Bayesian inference is another powerful form of such storytelling. It helps us understand how our **belief** about the occurrence of an event can affect the probability of the event.

Generally, **Bayes' theorem** can be stated as:

P(A|B) = \frac{P(B|A) P(A)}{P(B)}

where A and B are two events.

- P(A|B) := likelihood of the occurrence of event A given that event B has occurred.
- P(A) := prior probability of the occurrenc...
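As a quick numeric sketch of the theorem (the numbers below are hypothetical illustrations, not from the tutorial):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) computed via Bayes' theorem."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: likelihood P(B|A) = 0.9, prior P(A) = 0.01,
# and overall probability of the evidence P(B) = 0.05.
posterior = bayes(0.9, 0.01, 0.05)
print(round(posterior, 4))  # 0.18
```

Note how a strong likelihood (0.9) combined with a small prior (0.01) still yields a modest posterior; the prior belief matters.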

Read more…(411 words)

Category: Machine Learning

5.

tutorial

Dynamic Programming (continued) by Wiki

Since examples are the best way to understand dynamic programming, here are three more classic dynamic programming problems. Make sure you either solve each problem or try for at least a few hours before reading the solution.
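The excerpt's three problems are not shown here, but the memoization idea behind dynamic programming can be sketched with the classic Fibonacci example (my illustration, not from the tutorial):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Top-down dynamic programming: each subproblem is solved once
    # and its result cached, turning exponential recursion into O(n).
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025
```

Without the cache, the naive recursion recomputes the same subproblems exponentially many times; with it, fib(50) is instant.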

Read more…(246 words)

Category: International Olympiad in Informatics

comment in this discussion

Anirudh Thatipelli · CS student interested in Statistics · 1y

An excellent and easy explanation to Dynamic Programming by CS Dojo.

Read more… (11 words)

Contributed 100%

6.

tutorial

Conditional Probability

In the previous article, Probability, I defined what **probability** is. In this lesson, I will introduce important related concepts: **conditional probability** and **independence**.

We will start by introducing the concept of **independence**.

Till now, we have assumed that all events are independent, i.e., the occurrence of one event does not affect the occurrence of the other. For example, if we flip a coin twice, the probability of heads occurring twice is 1/2 × 1/2 = 1/4. The first toss cannot influence the second, so we consider the two tosses independent events.
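The 1/4 figure can be checked by enumerating the equally likely outcomes of two flips (a small sketch of the independence claim above):

```python
from itertools import product

# All equally likely outcomes of two fair coin flips: HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))

# P(both heads) should equal P(H) * P(H) = 1/2 * 1/2 = 1/4.
p_both_heads = sum(1 for o in outcomes if o == ("H", "H")) / len(outcomes)
print(p_both_heads)  # 0.25
```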

We can say that the coin has no memory of the first toss.

**Formally**, we can define two events A and B to be independent if:

Read more…(636 words)

Category: Machine Learning

7.

tutorial

What is Machine Learning? Why Machine Learning? by Wiki

Sometimes we encounter problems which are really har...

Read more…(1500 words)

Category: Machine Learning

comment in this discussion

Anirudh Thatipelli · ML enthusiast with interest in Statistics · 1y

According to the Deep Learning Book:

Read more… (4 words)

8.

discussion

Probability

Till now, we have studied statistics and the different tools we use to analyze data. Recall that **statistics** is the science of looking at given data and understanding the story of the data. **Probability**, on the other hand, is the study of the story and understanding the data behind it. There is a **complementary relationship between the two.**

Read more…(1026 words)

Category: Machine Learning

9.

discussion

Pie Charts

Till now, we have covered bar charts, histograms, and scatter plots to study data and make sense of it. As we have already covered, statistics is the *art of making sense of the data*, or, we can say, understanding the *story of the data*.

All the previous techniques helped us answer certain questions about the data.

A **scatter-plot** answered the question: **Does this data follow any specific pattern, like linearity?**

A **histogram** answered the question: **What is the range of data most frequently represented?**

A **bar chart** answered the question: **What is the bigger picture conveyed by the data points?**

Another question we would like answered is: **How is one category of data represented relative to the others? (Hint: pie chart)**
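A pie chart visualizes exactly this: each category's share of the whole. A small sketch, using hypothetical category counts (my example data, not from the article):

```python
# Hypothetical counts per category; a pie chart would show each
# category's slice as its percentage of the total.
counts = {"Python": 40, "C++": 25, "Java": 20, "Other": 15}
total = sum(counts.values())

shares = {name: 100 * n / total for name, n in counts.items()}
for category, pct in shares.items():
    print(f"{category}: {pct:.1f}%")
```

The shares always sum to 100%, which is the defining property a pie chart relies on.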

Read more…(259 words)

Category: Machine Learning

10.

discussion

Use of Sanskrit in Artificial Intelligence

This is a link to the paper on the usage of **Sanskrit** in Artificial Intelligence.

**Apologies if this is irrelevant to this community.**

Read more…(22 words)

Category: Machine Learning

11.

discussion

Histograms and Barcharts

In the last lecture, we learned how useful scatter-plots are when analyzing 2D data. But scatter-plots cannot capture all the essential details of the data, and they don't always give the correct answer when there is **noise**.

For example, consider the given table:

Read more…(396 words)

Category: Machine Learning
