General

Data Analytics Revision Sessions II

Written by Isma’il · 2 min read

Continuing the data analytics revision sessions, in this post I would like to discuss the remaining topics we treated. Probability distributions and Bayes’ theorem were covered in the first session, whereas linear programming, correlation, and regression were discussed in the second session.

Linear programming is a mathematical model or technique that provides management with quantitative support in their decision-making processes. I remember that during our school days we treated all sorts of problems under this topic. As an accounting student then, understanding the technique and its usefulness was part of the requirement. We were taught two different methods, namely the graphical and simplex methods.

It is well understood that linear programming works under certain assumptions, which include certainty, linearity, additivity, divisibility, non-negative variables, finiteness, and optimality. Of these assumptions, I think the facilitator explained only two, prompted by the curiosity of some of the students.

  • Non-negativity constraints:

He explained that in linear programming (LP), all answers or variables are assumed to be non-negative, since negative values of physical quantities are impossible.

  • Optimality:

In LP problems, the maximum-profit or minimum-cost solution always occurs at a corner point of the feasible region.
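The corner-point property can be demonstrated directly: for a two-variable problem, intersect every pair of constraint boundary lines, keep the intersections that satisfy all constraints, and evaluate the objective at each. Below is a minimal sketch of this idea in Python, using a made-up example (maximize Z = 3x1 + 5x2 subject to x1 ≤ 4, 2x2 ≤ 12, and 3x1 + 2x2 ≤ 18); the numbers are illustrative, not from the sessions.

```python
from itertools import combinations

# Each constraint is (a1, a2, b), meaning a1*x1 + a2*x2 <= b.
# Non-negativity is encoded as -x1 <= 0 and -x2 <= 0.
constraints = [
    (1, 0, 4),    # x1 <= 4
    (0, 2, 12),   # 2*x2 <= 12
    (3, 2, 18),   # 3*x1 + 2*x2 <= 18
    (-1, 0, 0),   # x1 >= 0
    (0, -1, 0),   # x2 >= 0
]

def corner_points(cons, tol=1e-9):
    """Intersect every pair of constraint boundary lines and keep
    only the intersections that satisfy all constraints -- these
    are the corners of the feasible region."""
    pts = []
    for (a1, a2, b), (c1, c2, d) in combinations(cons, 2):
        det = a1 * c2 - a2 * c1
        if abs(det) < tol:   # parallel boundaries: no unique intersection
            continue
        x1 = (b * c2 - a2 * d) / det   # Cramer's rule
        x2 = (a1 * d - b * c1) / det
        if all(p * x1 + q * x2 <= r + tol for p, q, r in cons):
            pts.append((x1, x2))
    return pts

# The optimum must be one of the corners: evaluate Z at each and pick the best.
best = max(corner_points(constraints), key=lambda p: 3 * p[0] + 5 * p[1])
print(best)  # the optimal corner point
```

Evaluating Z only at the corners is exactly what the graphical method does by hand; the simplex method generalizes the same idea to many variables by moving from corner to corner.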

Linear Programming Model:

  • Objective function: it is used to ascertain the optimal solution of a given linear programming problem. It has the following features:

Its value is to be either maximized or minimized subject to the constraints. The objective function is needed to solve optimization problems and is usually expressed as a linear function:

Z = ax1 + bx2, where a and b are constant coefficients and x1 and x2 are decision variables.

  • Constraints: the objective function is subject to the constraints of the linear programming problem. Decision variables are mathematical symbols representing the levels of activity of a firm. Constraints are restrictions placed on the firm by its operating environment, stated as linear relationships of the decision variables. They typically represent resource limits, or minimum or maximum levels of activity.

a11x1 + a12x2 + …… + a1nxn  (<=, =, >=)  b1

a21x1 + a22x2 + …… + a2nxn  (<=, =, >=)  b2

am1x1 + am2x2 + …… + amnxn  (<=, =, >=)  bm.

Where x1, x2, …, xn are the decision variables and the a’s and b’s are constants. The expression (<=, =, >=) means that each constraint may take any one of the three signs.

  • Non-negativity constraints: as already explained above, all decision variables must be non-negative, which can be expressed as:

x1, x2, ……, xn >= 0.
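A model in exactly this form (a linear objective, linear constraints, and non-negative variables) can be handed to an off-the-shelf solver. As a sketch with made-up coefficients, SciPy’s `linprog` solves such a model; note that `linprog` minimizes by default, so a maximization objective is negated.

```python
from scipy.optimize import linprog

# Hypothetical LP: maximize Z = 3x1 + 5x2
# subject to x1 <= 4, 2x2 <= 12, 3x1 + 2x2 <= 18, x1, x2 >= 0.
c = [-3, -5]                      # negated objective: linprog minimizes
A_ub = [[1, 0], [0, 2], [3, 2]]   # left-hand sides of the <= constraints
b_ub = [4, 12, 18]                # right-hand sides (available resources)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)

print("decision variables:", res.x)        # optimal x1, x2
print("objective value Z:", -res.fun)      # undo the negation
print("slack per constraint:", res.slack)  # unused resource in each constraint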

Throughout my accounting career, we digested plenty of linear programming (LP) problems.

Nevertheless, what appears to be different at LBS is the use of Excel Solver to solve LP problems and generate reports within a short period.

It was a great experience; it was awesome when Dr. Francis introduced Solver to solve LP problems using the simplex method.

We first designed the mathematical model and set up the table structure, where all available resources and the objective value were spelled out. From this we built the formula for the objective function and subjected it to the constraints. In other words, we set the objective, specified the changing variable cells, and added the constraints.

After that, you just click on Solver and a dialog will appear; fill in the objective function as well as the constraints, and then click Solve. The optimum solution will appear. If you want, you can generate three different reports: the answer, sensitivity, and limits reports.

Correlation and regression will be discussed in a subsequent post.

