Tuesday 27 February 2018

MicroStrategy Tutorial by Mindmajix

This tutorial gives you an overview and talks about the fundamentals of MicroStrategy.
1. Installation And Initial Setup
The first thing to know about MicroStrategy version 10 is that it contains an essential piece called Desktop.
For those coming from previous software generations (7, 8, or 9), MicroStrategy Desktop was the Windows-only tool used to create the metadata and reports. From version 9.3 onwards, that Desktop was renamed Developer, so don't be confused by the reuse of the name.
The Desktop version in 10 is multi-platform: it remains a desktop application, but it can now run on both Windows and Mac OS X, with no difference in functionality between the two versions.

1.1 Download

The first step, of course, is to download the software if we don't have it yet.
Note: If you have corporate access, ask your IT department to provide you with a copy of the software and the license.
1.2 Installation 
Once you have downloaded the corresponding installer, we'll proceed with the installation on our local computer.
After installation, we'll run Desktop for the first time and get our first glimpse of the application.
1.3 Setting Up Your Data Connection
In 10, we have several options for connecting to our data sources; one of the main ones is the ability to connect to our MicroStrategy corporate server. That is, we can run Desktop standalone, using data from different sources such as spreadsheet files and corporate databases, or we can connect Desktop directly to the project we have been assigned, upload our new data, and then create our reports in that project.
For more information about MicroStrategy and MicroStrategy Administration, visit Mindmajix.
 
Author
Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Monday 26 February 2018

What is Hadoop HIVE Query Language

Hive Query Language

HiveQL is the Hive Query Language.
Hive offers no support for row-level inserts, updates, and deletes.
Hive does not support transactions.
Hive adds extensions to provide better performance in the context of Hadoop and to integrate with custom extensions and even external programs.
DDL and DML are the two parts of HiveQL.
Data Definition Language (DDL) is used for creating, altering, and dropping databases, tables, views, functions, and indexes.
Data Manipulation Language (DML) is used to put data into Hive tables, to extract data to the file system, and to explore and manipulate data with queries, grouping, filtering, joining, and so on (one statement of each kind is sketched below).
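As a small sketch of the two halves of the language (the table name and path are hypothetical):
hive> CREATE TABLE logs (msg STRING);                          -- DDL: define a table
hive> LOAD DATA LOCAL INPATH '/tmp/logs.txt' INTO TABLE logs;  -- DML: load data into it
hive> SELECT msg FROM logs LIMIT 10;                           -- DML: query it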
Accelerate your career with Hadoop Training and become an expert in Apache Hadoop.

Databases in Hive:

A database in Hive is essentially just a catalog or namespace of tables.
Databases are very useful for larger clusters with multiple teams and users, as a way of avoiding table name collisions.
Hive provides commands such as:
CREATE DATABASE db_name;  -- creates a database in Hive
USE db_name;              -- switches to a database in Hive
DROP DATABASE db_name;    -- deletes a database in Hive
SHOW DATABASES;           -- lists the databases
If no database is specified, tables belong to the default database.
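As a quick sketch of these commands in use (the database name is hypothetical):
hive> CREATE DATABASE IF NOT EXISTS sales_db;
hive> SHOW DATABASES;          -- default, sales_db
hive> USE sales_db;            -- tables created from now on live in sales_db
hive> DROP DATABASE sales_db;  -- fails if tables remain; add CASCADE to force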

Tables in Hive:

A Hive table is logically made up of the data being stored and the associated metadata describing the layout of that data in the table.
 The data typically resides in HDFS, although it may reside on any Hadoop file system including the local file system.
 Hive stores the metadata in a relational database and not in HDFS.
The command for creating a table in Hive is:
hive> CREATE TABLE emp (empid INT, ename STRING, esal DOUBLE)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
      STORED AS TEXTFILE;
To display the description of the table, we use:
hive> DESC emp;
In Hive, there are two types of tables:
Managed tables
External tables
1. Managed tables
Managed tables are the ones managed by the Hive warehouse: whenever we create a managed table, it is stored under the default location of the Hive warehouse, i.e. /user/hive/warehouse.
When we drop a managed table, Hive deletes the data in the table.
Managed tables are less convenient for sharing with other tools.
Syntax for creating a Hive managed table:
hive> CREATE TABLE managed_tab (empid INT, ename STRING, esal INT)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
      STORED AS TEXTFILE;
As discussed above, the table directory is created under /user/hive/warehouse/managed_tab, which we can verify with the command:
# hadoop fs -ls /user/hive/warehouse
How to load data into managed tables:
We can load data in two ways:
Local mode
HDFS mode
In local mode, the syntax is:
hive> LOAD DATA LOCAL INPATH '/home/newBatch/input1.txt'
      INTO TABLE managed_tab;
In HDFS mode, the syntax is:
hive> LOAD DATA INPATH '/user/ramesh/hive/input2.txt'
      INTO TABLE managed_tab;
When loading from HDFS, the file is moved (not copied) into the warehouse directory: after a successful load it disappears from the source HDFS path, and we can see it under /user/hive/warehouse.
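Putting these steps together, a minimal end-to-end managed-table session (names and paths are hypothetical) might look like this:
hive> CREATE TABLE managed_demo (empid INT, ename STRING, esal DOUBLE)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
      STORED AS TEXTFILE;
hive> LOAD DATA LOCAL INPATH '/home/newBatch/input1.txt' INTO TABLE managed_demo;
hive> SELECT * FROM managed_demo;   -- verify the loaded rows
hive> DROP TABLE managed_demo;      -- deletes the metadata AND the warehouse files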
2. External tables
Along with managed tables, Hive also supports external tables.
Whenever the keyword 'external' appears in the table definition, Hive does not take ownership of the data; that is, the external table is not managed by the Hive warehouse system.
Along with the external keyword, we can also specify a 'location' in the table definition, stating exactly where the table's data is stored.
When you drop an external table, Hive leaves the data untouched and deletes only the metadata.
Syntax:
hive> CREATE EXTERNAL TABLE external_tab (empid INT, ename STRING, esal DOUBLE)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
      STORED AS TEXTFILE
      LOCATION '/user/ramesh/hive-external';
The location directory will be created automatically if it does not already exist.

Loading Data into External Tables:

Loading data from HDFS:
hive> LOAD DATA INPATH '/ramesh/inputdata.txt' INTO TABLE external_tab;
Flow of data in the Hive process at the sample location:
If we drop a managed table, both the schema and the data files are deleted.
But if we drop an external table, only the schema is deleted, and the data files remain at the specified location.

Difference between Managed Tables & External Tables:

One of the main differences between managed and external tables in Hive is that when an external table is dropped, the data associated with it does not get deleted; only the metadata (number of columns, column types, terminators, etc.) is dropped from the Hive metastore.
When a managed table gets dropped, both the metadata and the data get dropped.
So far, I have always preferred making tables external, because if the schema of my Hive table changes, I can simply drop the external table and recreate another external table over the same HDFS data with the new schema.
hive> CREATE EXTERNAL TABLE log_info_tab (logid INT, logerror STRING, logerrorcount INT)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
      STORED AS TEXTFILE
      LOCATION '/user/externallocation';
hive> SELECT * FROM log_info_tab;
   
We get the result from the files at the location path we specified.
For external tables, there is no need to load the data explicitly; Hive reads whatever files are present at the location.
However, most schema changes can now be made through ALTER TABLE or similar commands.
 So, the recommendation to use external tables over managed tables might be more of a legacy concern than a contemporary one.
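To make the drop behavior concrete, here is a minimal sketch (table names and paths are hypothetical) contrasting the two table types:
-- Managed: data lives under the warehouse; DROP removes data and metadata
hive> CREATE TABLE m_tab (id INT) STORED AS TEXTFILE;
hive> DROP TABLE m_tab;        -- /user/hive/warehouse/m_tab is deleted
-- External: data lives at LOCATION; DROP removes the metadata only
hive> CREATE EXTERNAL TABLE e_tab (id INT) STORED AS TEXTFILE
      LOCATION '/user/shared/e_tab';
hive> DROP TABLE e_tab;        -- the files under /user/shared/e_tab survive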

If you want more, visit Mindmajix.
 
Author
Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Introducing Robotic Process Automation

Robotics will change today’s workplace dramatically.
  • Creating Virtual Workforces

Robotics is growing rapidly, and with the arrival of bots, enterprises are enjoying the freedom to unlock the full potential of automation. Challenger start-ups are now opting for automated operating models to improve productivity and increase innovation.
Recent studies state that nearly 80 million jobs in the US and 15 million jobs in the UK are good candidates for automation, and that this innovation saves almost 43% of workers' time. RPA performs almost every kind of low-end task. This doesn't necessarily diminish human labour; rather, it diverts human energy into work that is more crucial and value-added.
Organizations are now taking a step forward in creating virtual workforces to fully digitalize their processes.
  • The Robotics Landscape

Typical uses of RPA

  • Double data entry - rekeying of data from one system to another is made easy.
  • Application migration - migrates application data and records as part of an upgrade.
  • Automation of reports - automates report generation to provide accurate reports.
  • Rule-based decision making - RPA can handle decision matrices efficiently and arrive at simple rule-based decisions accurately.
  • Well-defined Processing - Automatically enters inputs from source systems into target systems.

The Journey to Robotic Process Automation

The first step towards automation is a pilot.
  • Scoping:

Rapid Piloting in Development Environment.
-> Select an enterprise where a large number of processes are still handled manually.
-> Identify processes with digital triggers and high volumes with human exceptions.
  • Mapping:

-> Map the activities, apps, and variations, down to the keystroke level.
-> Build a baseline and achieve performance targets accordingly.
-> Choose RPA steps so as to minimise the contact between humans and robots.
  • Automation:

-> Construct robots to work to an automated standard.
-> Employ agile development and on-going testing.
-> Define interactions and changes between robots and staff roles.
  • Testing/Handover:

-> Run test cases to identify and capture exceptions.
-> Migrate IT functionality to production environment.
-> Control on-going execution and support of RPA process.
-> Wire in sustainability.
-> Plan further.

The outcomes of the Pilot:

-> Demonstrate the benefits of automation for pre-described tasks and processes.
-> Gauge the effectiveness of RPA technology with existing systems.
-> Highlight the technical and capability requirements for each automation programme.
-> Determine the risk, speed, and ease of implementation.
-> Deliver decent financial benefits and value after each implementation.
-> Start the journey towards a robotic workforce.

If you want more, visit Mindmajix.

 
Author

Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Friday 23 February 2018

WorkFusion Interview Questions

If you're looking for WorkFusion interview questions for experienced candidates or freshers, you are at the right place. There are a lot of opportunities from many reputed companies in the world. According to research, WorkFusion has a market share of about 49.29%. So, you still have the opportunity to move ahead in your career in WorkFusion development. Mindmajix offers advanced WorkFusion interview questions (2018) that help you crack your interview and acquire your dream career as a WorkFusion developer.
Q. In WorkFusion, what exactly is Double Subscription?
Double subscription is one of the very important aspects of Business Process Management. There are cases when messages are not delivered to the subscriber instantly, for reasons such as the user being unavailable or away from the covered region. Messages sent in such a situation remain in a queue and are delivered when the user comes online. This phenomenon is generally called double subscription, and users can gain real benefits from this approach.
Q. What are the tasks that can be automated with the help of Robotic Process Automation?
RPA is entirely software-based, and users need to keep this in mind. Once deployed, it can handle several minor as well as major tasks. Maintenance of records, transaction management, and query handling can all be done with RPA. It is also capable of managing complex business applications and data in the right manner. One of the best things is that RPA supports a lot of platforms, such as HTML and Java, so users need not worry about compatibility issues.
Q. Name the segments in which Business Process Management task can be derived without affecting the overall functionality of any sub-task associated with the Primary one?
The common segments are milestones and lanes. In WorkFusion, lanes are generally represented as horizontal lines, while milestones are represented as vertical lines.
Q. Tell us whatever you know about RPA testing.
RPA testing is an approach that is extremely useful for assuring quality outcomes. As the level of complexity grows, it can be grouped into two categories. The first is testing the Business Process Management itself through RPA. The next factor to pay attention to is the instructions: all the instructions passed to the software robots should be verified and checked again.
Q. What are some of the components of Business Process Management in WorkFusion you are familiar with?
The very first, and in fact the most important, is the Process Server, which makes sure that all tasks are centrally monitored and controlled. In addition, another extremely useful component is the Process Designer, which is generally followed by the Process Center. Various consoles, such as the Admin Console and the Data Warehouse, are also part of BPM in many cases.
Enthusiastic about exploring the skill set of WorkFusion? Then have a look at the WorkFusion Training Course to gather additional knowledge.
Q. In WorkFusion BPM, what exactly is a UCA?
It stands for Under Cover Agent, which is generally used for sharing authentic, secret, or personal information among users. A UCA always makes sure that no unauthorized subscriber catches the message. Basically, the original message is attached to it or kept inside the frame.
Q. What exactly do you mean by serialization?
There are certain stages when XML needs to be converted into a format that is compatible with the Teamwork objective. This process is generally called serialization.
Q. What are the advantages of Robotic Process Automation?
There are certain benefits that users can always count on:
* It is cost-effective
* It is faster
* It has excellent consistency if everything is managed reliably
* It can boost customer satisfaction
* RPA is well known for accuracy as well as quality
In addition to all of the above, one of the major factors that has contributed to its success is improved analytics.
If you want more, visit Mindmajix.
Author
Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Thursday 22 February 2018

Workday Tutorial

This Workday tutorial talks about the basic functionalities, features, and offerings of the Workday software.

WORKDAY OFFERINGS

Workday is building a complete suite of on-demand products to help you run your business.
Human Capital Management – Workday's HCM suite includes Staffing, Absence, Benefits, Performance, Compensation, and Development.
Payroll – Workday is delivering a powerful new payroll product that allows you to group employees, manage payroll calculation rules, and pay employees according to their organizational, policy and reporting needs. We also offer the ability to integrate with leading outsourced payroll providers such as ADP Payforce.
Financial Management – Workday delivers all the core financial management processes, including financial accounting and reporting, resource management (procurement, asset management), supplier accounts, customer accounts, cash management, and revenue accounts.
Worker Spend Management – Worker Spend Management gives you the ability to track expenses, monitor spending on contingent labor, and handle desktop procurement.
Workday Benefits Network – Delivers pre-built integrations to over 70 benefits providers.
Learn from Mindmajix experts with Real Time Scenarios, Curriculum, Free Demo for:
>> Workday Training
>> Workday Integration Training
>> Workday Payroll Training

WORKDAY ARCHITECTURE

Workday's applications run entirely in memory, in a highly object-oriented structure. Persistence is mainly for the sake of data safety, but not entirely. In earlier releases, Workday kept absolutely everything in RAM. However, certain things are kept only on disk, such as:
  • Audit files.
  • Certain documents (notably resumes)
The main components of the Workday architecture include:

UI Server provides the following:

  • Entry point for end users
  • Flash/Flex based
  • Wide Browser Support
  • UIs are automatically generated for:
  • iPhone
  • Mobile HTML
  • PDF export
  • Excel export

Object Management Server (OMS) is responsible for the following:

  • Main processing engine
  • Services UI and data requests that come from the UI Server and Integration Servers

Integration Server :

Workday only talks to the outside world via web services.
  • Workday is heavily into SOAP (Simple Object Access Protocol).
  • It translates SOAP into whatever else might be needed for integration, and also does reliable delivery.
  • All that said, Workday sees integration among various SaaS offerings as an area needing significant future attention.
  • It also provides transformation services according to the data formats that it is integrating with.

Persistent Store:

  • All Workday application changes are captured in the database.
  • Workday uses GPLed MySQL/InnoDB.
  • The persistent store is responsible for data replication and data backups.
Core concepts and Navigation Basics
Core Concept: Workday Foundation
  • Supervisory Organization: the foundation of Workday HCM. This organization type groups workers into a management hierarchy.
  • Staffing Model: defines how jobs and positions are created and filled in a supervisory organization.
  • Job Profiles: define the generic features and characteristics of a job and of positions built off that profile.
  • Compensation: the umbrella term for compensation packages, grades, grade profiles, and plans.
  • Security: a security group is a collection of users, or a collection of objects that are related to users. Allowing a security group access to a securable item in a security policy grants access to the users associated with that security group.
  • Business Process: a sequence of one or more tasks that accomplishes a desired business objective.


If you want more, visit MindMajix.


Author
Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.


Tuesday 20 February 2018

Data Modeling

A vital part of practically every business is record keeping. In our information society, this has become an imperative part of business, and a significant portion of the world's computing power is devoted to maintaining and using databases.

Databases of assorted types pervade nearly every business. A wide range of information, from emails and contact data to financial data and sales records, is stored in some form of database. The quest is on for meaningful storage of less-structured information, such as subject knowledge.

This is the first of a two-part article that gives an introduction to relational databases and the SQL language. This first part describes some of the key components of the technology, with an emphasis on database normalization. The second part will describe a less theoretical approach to database design, as well as give an introduction to the SQL language.

Why is Data Modeling Important?

Data modeling is probably the most labor-intensive and time-consuming part of the development process. Why bother, especially if you are in a hurry? A typical response by practitioners who write on the subject is that you should no more build a database without a data model than you should build a house without blueprints.
The data model is also detailed enough to be used by database developers as a "blueprint" for building the physical database. The information contained in the data model is used to define the relational tables, primary and foreign keys, stored procedures, and triggers. A poorly designed database will require more time in the long run. Without careful planning, you may create a database that omits data required to generate critical reports, produces results that are incorrect or inconsistent, and cannot accommodate changes in the user's requirements.
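As a minimal sketch of what such a blueprint translates into (table and column names are hypothetical), a normalized customer/order pair with primary and foreign keys might be defined like this:

-- Each customer is stored exactly once; orders reference customers by key,
-- so customer details are never duplicated across orders.
CREATE TABLE customer (
    customer_id  INT          NOT NULL PRIMARY KEY,
    name         VARCHAR(100) NOT NULL,
    email        VARCHAR(100)
);

CREATE TABLE sales_order (
    order_id     INT          NOT NULL PRIMARY KEY,
    customer_id  INT          NOT NULL,
    order_date   DATE         NOT NULL,
    FOREIGN KEY (customer_id) REFERENCES customer (customer_id)
);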

Data Modeling As Part of Database Design

The data model is one player in the conceptual design process. The other is the functional model. The data model centers on what data should be stored in the database, while the functional model deals with how the data is processed. To put this in the context of the relational database, the data model is used to design the relational tables. The functional model is used to design the queries that will access and perform operations on those tables.
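Continuing the hypothetical schema sketched above, a query that the functional model might specify could look like this:

-- How many orders has each customer placed?
SELECT c.name, COUNT(*) AS order_count
FROM customer c
JOIN sales_order o ON o.customer_id = c.customer_id
GROUP BY c.name;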

Data modeling is preceded by planning and analysis. The effort devoted to this stage is proportional to the scope of the database. The planning and analysis of a database intended to serve the needs of an enterprise will require more effort than one intended to serve a small workgroup.

If you want more, visit MindMajix.

Author

Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Monday 19 February 2018

Integrated Business Planning (IBP)

Integrated Business Planning (IBP) can most simply be described as advanced, or next-generation, sales and operations planning (S&OP). IBP represents the evolution of S&OP from its production-planning roots in the 1970s into fully integrated management and supply chain collaboration. Over the years, conventional S&OP has typically been positioned as a supply chain process, balancing demand and supply over a one- to two-year horizon, with no real financial integration; the focus has been on the short term and on the numbers, rather than on the issues. IBP is the business planning process for the post-recession era; integration is what distinguishes it from its predecessor, and it brings with it a truly strategic perspective. Driven by the executive team, IBP is a common-sense process designed for effective decision making. It allows senior management to plan and manage the entire organization over a two-year horizon or more, aligning strategic and tactical plans every month, and allocating critical resources – people, equipment, inventory, materials, time, and money – to satisfy customers in the most profitable way.

Principal goals of IBP

An effective IBP process enables an organization to plan for success and then align the entire organization to execute against this single plan. Key to the IBP process is a rolling two-year integrated volume and financial plan ('one set of numbers') that covers the product portfolio, demand, and supply. It establishes a bottom-up, realistic plan and ensures early focus on any potential gaps in business performance against targets, business plans, goals, and strategic plans, enabling organizations to anticipate and respond positively to changing conditions, in plenty of time. So whatever lies ahead, IBP can help you prepare for it.

Challenges

Understanding, leadership, and commitment. The biggest challenge in implementing an IBP process, or in moving from S&OP to IBP, is for organizations to appreciate the true scale of what the process can achieve for the business, and for the leadership team to be prepared to commit to a new style of running the business. Then comes the sizeable challenge of overcoming the traditional mindset of working in functional silos, and of integrating all the key areas of the business (Product, Sales, Marketing, Supply, Finance, R&D, and so on) into a single Integrated Business Planning process.



If you want more about SAP IBP, visit Mindmajix.



Author

Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Thursday 15 February 2018

Using the SAS Information Delivery Portal - SAS BI

The SAS Information Delivery Portal is started from your browser by entering a specific URL.
On the classroom machines you use: https://localhost:8080/Portal

Using the SAS Information Delivery Portal

This demonstration shows how to use the SAS Information Delivery Portal.
1. Open an Internet Explorer window and enter the following URL: https://localhost:8080/Portal
The SAS Information Delivery Portal Public Kiosk is displayed. This is the default page that all users see before they sign on to the portal:
2. Select Log On to sign on to the portal.
3. Enter the User name and Password provided by the instructor.
4. Select Log On. The initial content displayed contains one page with two portlets:
A collection portlet is a portlet that contains a list of portal content items. To use a collection portlet, click the name of an item in the list. Depending on the content type, the portal will either display the item or launch a new application in your browser window.
A bookmark portlet is a predefined portlet that enables you to maintain a list of content items that you want to refer to later. Bookmarks are generally used to maintain a list of content items for short-term use.

If you want more about SAS BI, visit MindMajix.
 
Author

Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.

Monday 12 February 2018

Oracle GoldenGate Checkpoint Mechanism

GoldenGate has its own checkpoint mechanism to deal with any outages during the replication process. Checkpoints store the current read and write positions of a process to disk for recovery purposes. Checkpoints guarantee that data changes marked for synchronization are actually captured by Extract and applied to the target by Replicat, and they prevent redundant processing. They provide fault tolerance by preventing the loss of data should the system, the network, or an Oracle GoldenGate process need to be restarted. For complex synchronization configurations, checkpoints enable multiple Extract or Replicat processes to read from the same set of trails.

Checkpoints work with inter-process acknowledgments to keep messages from being lost in the network. Oracle GoldenGate has a proprietary guaranteed-message-delivery technology.


Extract Checkpointing

Extract creates checkpoints for its positions in the data source and in the trail. Since Extract only captures committed transactions, it must keep track of the operations in every open transaction, in case any of them are committed. This requires Extract to record a checkpoint where it is currently reading in a transaction log, plus the position of the start of the oldest open transaction, which can be in the current log or any preceding log.

To control the amount of transaction log that must be reprocessed after an outage, Extract persists its current processing state and data to disk at specific intervals, including the state and data (if any) of long-running transactions. If Extract stops after one of these intervals, it can recover from a position within the previous interval or at the last checkpoint, instead of returning to the log position where the oldest open long-running transaction first appeared.

The checkpoints for the end of the last committed transaction written to the trail and for the start of the oldest uncommitted transaction in the log are event-based checkpoints, which are recorded on the commit and on the start of a transaction.

In contrast, the checkpoint for the last read position is recorded periodically, to control the amount of transaction log that must be reprocessed after an outage.

Replicat Checkpointing

Replicat creates checkpoints for its position in the trail. Replicat stores its checkpoints in a checkpoint table in the target database to couple the commit of its transaction with its position in the trail file. The checkpoint table ensures consistency after a database recovery by guaranteeing that a transaction is applied only once, regardless of whether there is a failure of the Replicat process or the database process. For reporting purposes, Replicat also has a checkpoint file on disk in the dirchk subdirectory of the Oracle GoldenGate directory.

Checkpoints are not required for non-continuous types of configurations that can be re-run from a start point if necessary, such as initial loads.


If you want more, visit MindMajix.

Author

Lianamelissa is a Research Analyst at Mindmajix. A techno freak who likes to explore different technologies, she follows technology trends in the market and writes about them.