Install Mini Sap Basis Consultant

• A vendor offers you a material at a gross price (PB00) of EUR 1,200. In addition, the vendor gives you a 15% discount (RB01) and a 5% cash discount (SKTO).

The vendor charges EUR 90 for freight costs (FRB1). What is the effective price if you use the calculation schema shown in the attached graphic?

Level  Counter  Condition type  Description              From
1      1        PB00            Gross Price
10     1        RB01            Discount %               1
15     1        ZC01            Surcharge %              1
20     0                        Net value
30     1        FRB1            Absolute freight amount  20
35     1        SKTO            Cash discount            20
40     0                        Effective price

• EUR 1,059
• EUR 1,042
• EUR 1,032
• EUR 1,050
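The arithmetic behind the schema can be checked with a short sketch (plain Python that simply mirrors the table above; this is not SAP pricing code):

```python
# Walk the calculation schema step by step.
gross = 1200.0                      # level 1: PB00 gross price
discount = gross * 0.15             # level 10: RB01, 15% of level 1
net = gross - discount              # level 20: net value = 1020.0
freight = 90.0                      # level 30: FRB1 absolute amount
cash_discount = net * 0.05          # level 35: SKTO, 5% of level 20
effective = net + freight - cash_discount   # level 40: effective price
print(effective)                    # 1059.0, so EUR 1,059 is the correct answer
```

Note that SKTO is calculated from level 20 (the net value of 1,020), not from the freight-inclusive subtotal, which is why the answer is 1,059 rather than 1,054.50.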

Background: Leading SAP authority Glynn C. Williams, author of “Implementing SAP R/3 Sales and Distribution” and “Implementing SAP ERP Sales and Distribution”, has released his personal library of more than 230 SAP tips and tricks, compiled after implementing SAP in more than 39 countries and consulting in more than 5 functional domains. These tips and tricks are cross-functional and easy to reference, empowering the reader with valuable time-saving tips. Even at $1 per tip, the book should retail at $230. 'The Advanced SAP Consultants Handbook' will enhance any reader's SAP skill set.

Hi guys, could someone please assist me with the link for downloading MiniSAP? I just got my SAP BASIS certificate and want to get some practice; I would highly appreciate your help. Follow the installation instructions, which you can find in the root directory where you decompressed the download file.

Synopsis • The 'Advanced SAP Consultants Handbook' is a summary of tips and tricks gained on more than 18 SAP projects over a period of more than a decade. • It contains more than 230 SAP tips and tricks, with more than 200 screenshots. • It is designed for the reader to grasp the fundamentals of an SAP tip or trick within minutes, in most cases using a single page per tip in the 364-page book.


• The reader learns valuable, advanced, time-saving SAP tips and tricks not taught in training centres. • This is the same reference manual that is used by countless professionals on countless SAP projects worldwide. Inbound Delivery. Definition: “An Inbound Delivery (ID) is a record which holds all the information/data required to start and monitor the inbound delivery process. The ID process begins with the goods receipt in the yard and closes with the movement of the goods to their final putaway.” In practice it is a notification from the vendor, against a PO, that goods will be delivered on specific dates.

Creation of inbound deliveries: If your vendor sends you an Advance Shipping Notification before the goods are actually received, an inbound delivery is created so that space can be reserved in the storage location for the goods.

E.g.: A PO is issued to a vendor specifying the delivery date of the required material. After viewing the PO, the vendor informs you that, due to a shortage of resources, they are unable to deliver the required material by the dates mentioned in the PO and need one more week to deliver the products. An inbound delivery is then created against the PO issued to that vendor, with mutually agreed dates for the required goods, and these dates are recorded in the inbound delivery. The inbound delivery can be created using transaction code VL31N.

The initial screen for creating an inbound delivery appears. Enter the right vendor and the delivery date; the system automatically proposes the current date as the delivery date. If you want to create an inbound delivery for a specific PO, enter the PO number; otherwise the system will find all the POs due for inbound delivery automatically. After that the overview screen appears and the purchase order data is copied into the inbound delivery, as shown in the screenshot below. Some additional data can be entered on the header- and item-level screens (e.g. transportation planning, route, etc.). After making the necessary changes, save the inbound delivery; when the system saves it, it generates the inbound delivery number. The inbound process that happens when goods are received involves all the stages of an external procurement process.

Basically, the inbound delivery is a follow-on activity to the purchase order. The inbound delivery can be created using transaction code VL31N. The process starts when the goods are presented at the vendor's shipping point and ends when the goods receipt is posted at the receiving end. After creation of the PO, the process has the following steps: • Notification • Inbound delivery • Subsequent putaway of the goods • Posting of the goods receipt. Process flow: • ME21N – In the purchase order, at item level, choose the Confirmations tab and select the confirmation control key “Inbound Delivery”. • Create an inbound delivery (VL31N) against the purchase order. • Post the goods receipt against the inbound delivery through MIGO.
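The document flow above can be sketched as a tiny state model (purely illustrative; the class and status names are invented here and are not SAP objects or APIs):

```python
# Minimal sketch of the inbound delivery flow: create against a PO (VL31N),
# put the goods away, then post the goods receipt (MIGO).
class InboundDelivery:
    def __init__(self, po_number, delivery_date):
        self.po_number = po_number
        self.delivery_date = delivery_date      # mutually agreed date
        self.status = "created"                 # VL31N: delivery created

    def putaway(self):
        assert self.status == "created"
        self.status = "putaway"                 # goods moved to final bin

    def post_goods_receipt(self):
        assert self.status == "putaway"
        self.status = "goods_receipt_posted"    # MIGO against the delivery

delivery = InboundDelivery("4500000123", "2024-05-01")
delivery.putaway()
delivery.post_goods_receipt()
print(delivery.status)
```

The assertions enforce the ordering described in the text: the goods receipt can only be posted after the delivery exists and the putaway has happened.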

Accounting difference: If you view the accounting document of the material document where the goods are received without an inbound delivery, the accounting details are: 300000 Inventory - Raw Mate, 191100 Goods Rcvd/Invoice R, 281000 Income - price varia. The accounting document of the material document where the goods are received with an inbound delivery shows: 300000 Inventory - Raw Mate, 191100 Goods Rcvd/Invoice R, 231000 Loss - price varianc. If you compare both accounting documents, there is only one difference: the price variance account. One shows income and the other shows a loss. Predicting BW Database Volume.

Revisiting the Technical Content in the BW Administration Cockpit with SAP Predictive Analysis. The following blog post demonstrates how to use the technical content of SAP BW as a forecast data basis for a prognosis model in SAP Predictive Analysis. The aim is to show a smooth and straightforward process, avoiding additional modelling outside of BW as much as possible. In the described use case, the Database Volume Statistics have been chosen as an example.

The official SAP Help summarizes the Technical Content in the BW Administration Cockpit as follows: “The technical BI Content contains objects for evaluating the runtime data and status data of BW objects and BW activities. This content is the basis for the BW Administration Cockpit, which supports BW administrators in monitoring statuses and optimizing performance.”

The Technical Content with its pre-delivered Web Reporting might look a bit old-fashioned; nevertheless, the variety, quality, and quantity of data that is “generated” at any time in the system is very useful and important for further analysis. The data has a strong focus on performance (e.g. query runtimes, loading times), but other system-related data such as volume statistics is also available.

BW on HANA and SAP Predictive Analysis together extend the possibilities of how to view the data and what (potentially more) to do with it. Technically there are simply the following 3 steps to follow: • Expose the cube information model to HANA (SAP BW) • Adjust data types to the PA-specific format (HANA Studio) • Create the forecast model (SAP PA Studio). The Database Volume statistics in the technical content are designed with a simple data model consisting of just one cube with some characteristics (day, week, month, DB object, object type, DB table, etc.) and key figures (DB size in MB, number of records, etc.). Following the above steps with this set of data and choosing a certain type of algorithm results in the bar chart shown below, integrated with forecast figures for the past and some months into the future.

The blue bars represent the actual database size by month. The green line represents the calculated figures of the forecast model (in this case double exponential smoothing) for the past 20 months and 10 months into the future. Hello friends of Integrated Planning, thank you very much for all the feedback I have received on the File Upload/Download how-to over the past years. I have great news: basically every development request has been implemented!
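The double exponential smoothing behind the green line can be sketched as follows. This is a minimal, generic Holt's-linear illustration with made-up smoothing parameters and sample sizes, not the implementation inside SAP Predictive Analysis:

```python
def double_exponential_smoothing(series, alpha=0.5, beta=0.3, horizon=3):
    """Holt's linear (double exponential) smoothing: track level and trend."""
    level, trend = series[0], series[1] - series[0]
    fitted = [level]
    for value in series[1:]:
        prev_level = level
        level = alpha * value + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        fitted.append(level)
    # extrapolate the last level/trend pair into the future
    forecast = [level + (step + 1) * trend for step in range(horizon)]
    return fitted, forecast

# monthly database size in GB (illustrative numbers, not real statistics)
sizes = [100, 104, 109, 115, 120, 127]
fitted, forecast = double_exponential_smoothing(sizes)
print([round(value, 1) for value in forecast])
```

Because the series trends upward, the extrapolated forecast keeps growing, which matches the rising green line described in the chart.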

Yes, this means that there is a big load of new features available with version 3. Upload and download of CSV files, and a new user interface that lets you preview the file and plan data before you save them, are just two of the highlights. The new version is also compatible with SAP NetWeaver BW 7.4. Prerequisites: the minimum release for using the new version is SAP NetWeaver BW 7.30.

Download: You can download the complete how-to guide as well as the solution from.

Big Data & SAP. Big data has been a very revolutionary area, particularly due to the load, or bulk, of data processing and its underlying usage on the social platform. A technology base like SAP, primarily due to its huge adoption user base, could benefit from any integration on the Big Data storage and processing fronts.

There are interesting articles on the web which depict the current trends in this field. Putting down links to some interesting reads on this topic - Big data and SAP - HANA vs Hadoop - HANA Hadoop Integration - Sources: scn, sap & hana. SAP Fiori - using SAP in a web browser - simplified! SAP Fiori is perhaps a very simplified version of SAP provided to the user. SAP Fiori sits on NetWeaver Gateway and offers out-of-the-box rich business process capabilities by leveraging your existing platform and mobilising it through the use of a browser, not through a mobile platform. You can create a sales order, approve a purchase order, and use some 25 such profiles, all in a web browser connected to the SAP backend.

Here is a demonstration example: in your internet browser on a phone or laptop, open the link (after the proper installations). The browser will then prompt you to enter a user ID and password, the same authorisation you use in the SAP backend system.

Depending on your profile, you will be prompted to select from profiles 1 to 25. You are then provided with the ability to create a sales order from scratch in the web browser, enter shipping instructions, etc., and confirm. The best part is that you can execute the sales order in a web browser on a mobile, laptop or desktop, as it is built on HTML5 as a mobile-responsive site. SAP HANA Architecture. • The Index Server contains the actual data and the engines for processing the data.

It also coordinates and uses all the other servers. • The Name Server holds information about the SAP HANA database topology. This is used in a distributed system with instances of the HANA database on different hosts. The name server knows where the components are running and which data is located on which server. • The Statistics Server collects information about status, performance and resource consumption from all the other server components. From the SAP HANA Studio we can access the Statistics Server to get the status of various alert monitors. • The Preprocessor Server is used for analysing text data and extracting the information on which the text search capabilities are based.

• The XS Engine is an optional component. Using the XS Engine, clients can connect to the SAP HANA database to fetch data via HTTP. The SAP HANA Index Server performs 7 key functions to accelerate and optimize analytics. Together, these functions provide robust security, data protection and enhanced data access. • Connection and Session Management – This component initializes and manages sessions and connections for the SAP HANA database using pre-established session parameters. SAP has long been known for excellence in session management through its integration of SAProuter into the SAP GUI product used as a front end for accessing the ABAP stack.

SAP HANA retains the ability to configure connection and session management parameters to accommodate the complex security and data transfer policies instituted. • Authentication – User- and role-based privileges are authenticated by the SAP HANA database. (The users, authorizations and roles within the SAP ERP system are not applicable or transportable to the SAP HANA instance.) The SAP HANA authentication model allows the granting of privileges to users or roles, and a privilege grants the right to perform a specified SQL operation on a specific object. SAP HANA also utilizes a set of analytic privileges that represent filters or hierarchy drilldown limitations for analytic queries, to protect sensitive data from unauthorized users. This model enforces “segregation of duties” for clients that have regulatory requirements for the security of data.
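The two privilege layers described above can be illustrated with a toy model. Everything here (the dictionaries, role names, and filter structure) is invented for illustration and is in no way HANA's actual authorization API:

```python
# Toy model: SQL privileges granted to roles, plus an "analytic privilege"
# that filters which dimension values a user may see.
privileges = {
    "analyst_role": {("SELECT", "SalesCube")},
}
user_roles = {"alice": {"analyst_role"}}
# analytic privilege: restrict alice to one value of a dimension attribute
analytic_filter = {"alice": {"region": {"EMEA"}}}

def allowed(user, operation, obj):
    """Check the object-level SQL privilege via the user's roles."""
    granted = set()
    for role in user_roles.get(user, set()):
        granted |= privileges.get(role, set())
    return (operation, obj) in granted

def visible_rows(user, rows):
    """Apply the analytic (row-filter) privilege on top of the SQL privilege."""
    filters = analytic_filter.get(user, {})
    return [r for r in rows
            if all(r[attr] in values for attr, values in filters.items())]

rows = [{"region": "EMEA", "amount": 10}, {"region": "APJ", "amount": 20}]
print(allowed("alice", "SELECT", "SalesCube"))  # role grants the SQL privilege
print(visible_rows("alice", rows))              # analytic privilege filters rows
```

The point of the sketch is the layering: the SQL privilege decides whether the query runs at all, while the analytic privilege decides which slice of the cube's dimension values the result may contain.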

• SQL Processor – The SQL Processor segments data queries and directs them to specialty query-processing engines for optimized performance. It also ensures that SQL statements are accurately authored and provides some error handling to make queries more efficient. The SQL Processor contains several engines and processors that optimize query execution: • The Multidimensional Expressions (MDX) Engine queries and manipulates the multidimensional data stored in OLAP (OnLine Analytical Processing) data cubes. • The Planning Engine enables the basic planning operations within the SAP HANA database for financial planning operations. • The Stored Procedure Processor executes procedure calls for optimized processing without reinterpretation (e.g. converting a standard InfoCube into an SAP HANA-optimized InfoCube). • The Calculation Engine converts data into calculation models and creates logical execution plans to support parallel processing.

• Relational Stores – SAP has further segmented the storage of in-memory data into compartments within memory for speedier access. Data not needed immediately is stored on a physical disk as opposed to RAM. This allows quick access to the most relevant data. The SAP HANA database houses four relational stores that optimize query performance: • The Row Store stores data in a row-type fashion and is optimized for high performance of write operations; it is derived from the P-Time “in-memory system”, which was acquired by SAP in 2005. The Row Store is held fully in RAM. • The Column Store stores data in a column-type fashion and is optimized for high performance of read operations; it is derived from TREX (Text Retrieval and Extraction), which was unveiled by SAP in the SAP NetWeaver Search and Classification product. This technology was further developed into a full relational column-based store.

The Column Store is held fully in RAM. • The Object Store is an integration of SAP liveCache technology into the SAP HANA database. • The Disk-Based Store is used for data that does not need to be held in memory and is best used for “tracing data” or old data that is no longer used. The Disk-Based Store is located on a hard disk and is pulled into RAM as needed. • Transaction Manager – The SAP HANA database processes individual SQL statements as transactions.

The Transaction Manager controls and coordinates transactions, sending relevant data to the appropriate engines and to the Persistence Layer. This segmentation simplifies administration and troubleshooting. • Persistence Layer – The Persistence Layer provides built-in disaster recovery for the SAP HANA database. The algorithms and technology are based on concepts pioneered by MaxDB and ensure that the database is restored to the most recent committed state after a planned or unplanned restart. Backups are stored as savepoints in the data volumes via a savepoint coordinator, which is typically set to back up every five to ten minutes. Any changes that occur after a savepoint are designated as uncommitted transactions and are stored in the Transaction Log Volume.

Typically, these volumes are saved to media and shipped offsite as a cold-backup disaster recovery remedy. • Repository – The Repository manages the versioning of metadata objects such as Attribute Views, Analytic Views and stored procedures. It also enables the import and export of repository content. SAP HANA CONNECTIVITY OVERVIEW. Now let us check the architecture components of the SAP HANA Index Server. SAP HANA Index Server Architecture: • The Connection and Session Management component is responsible for creating and managing sessions and connections for the database clients. Once a session is established, clients can communicate with the SAP HANA database using SQL statements. For each session a set of parameters is maintained, such as auto-commit, the current transaction isolation level, etc. Users are authenticated either by the SAP HANA database itself (login with user and password) or authentication can be delegated to an external authentication provider such as an LDAP directory.

• The client requests are analyzed and executed by the set of components summarized as Request Processing and Execution Control. The Request Parser analyses the client request and dispatches it to the responsible component. The Execution Layer acts as the controller that invokes the different engines and routes intermediate results to the next execution step. For example, transaction control statements are forwarded to the Transaction Manager, data definition statements are dispatched to the Metadata Manager, and object invocations are forwarded to the Object Store. Data manipulation statements are forwarded to the Optimizer, which creates an optimized execution plan that is subsequently forwarded to the execution layer.

• The SQL Parser checks the syntax and semantics of the client SQL statements and generates the logical execution plan. Standard SQL statements are processed directly by the database engine. • The SAP HANA database has its own scripting language, named SQLScript, that is designed to enable optimizations and parallelization. SQLScript is a collection of extensions to SQL.

SQLScript is based on side effect free functions that operate on tables using SQL queries for set processing. The motivation for SQLScript is to offload data-intensive application logic into the database. • Multidimensional Expressions (MDX) is a language for querying and manipulating the multidimensional data stored in OLAP cubes. • The SAP HANA database also contains a component called the Planning Engine that allows financial planning applications to execute basic planning operations in the database layer.

One such basic operation is to create a new version of a dataset as a copy of an existing one while applying filters and transformations. For example: Planning data for a new year is created as a copy of the data from the previous year. This requires filtering by year and updating the time dimension.

Another example of a planning operation is the disaggregation operation, which distributes target values from higher to lower aggregation levels based on a distribution function. • The SAP HANA database also has built-in support for domain-specific models (such as for financial planning) and it offers scripting capabilities that allow application-specific calculations to run inside the database. SAP HANA database features such as SQLScript and planning operations are implemented using a common infrastructure called the Calc Engine. SQLScript, MDX, planning models and domain-specific models are converted into calculation models. The Calc Engine creates a logical execution plan for a calculation model and will break up the model, for example some SQLScript, into operations that can be processed in parallel.

The engine also executes the user-defined functions. • In the HANA database, each SQL statement is processed in the context of a transaction. New sessions are implicitly assigned to a new transaction. The Transaction Manager coordinates database transactions, controls transactional isolation and keeps track of running and closed transactions. When a transaction is committed or rolled back, the Transaction Manager informs the involved engines about this event so they can execute the necessary actions. The Transaction Manager also cooperates with the persistence layer to achieve atomic and durable transactions. • Metadata can be accessed via the Metadata Manager.

The SAP HANA database metadata comprises a variety of objects, such as definitions of relational tables, columns, views and indexes, definitions of SQLScript functions, and object store metadata. Metadata of all these types is stored in one common catalog for all SAP HANA database stores (in-memory row store, in-memory column store, object store, disk-based).

Metadata is stored in tables in the row store. SAP HANA database features such as transaction support and multi-version concurrency control are also used for metadata management. In distributed database systems, central metadata is shared across servers. How metadata is actually stored and shared is hidden from the components that use the Metadata Manager. • The Authorization Manager is invoked by other SAP HANA database components to check whether the user has the required privileges to execute the requested operations. SAP HANA allows the granting of privileges to users or roles.

A privilege grants the right to perform a specified operation (such as create, update, select, execute, and so on) on a specified object (for example a table, view, SQLScript function, and so on). The SAP HANA database supports analytic privileges that represent filters or hierarchy drilldown limitations for analytic queries. Analytic privileges grant access to values with a certain combination of dimension attributes and are used to restrict access to a cube to certain values of the dimensional attributes. • The Database Optimizer takes the logical execution plan from the SQL Parser or the Calc Engine as input and generates the optimized physical execution plan based on the database statistics; the optimizer determines the best plan for accessing the row or column stores. • The Database Executor executes the physical execution plan to access the row and column stores and also processes all the intermediate results.

• The Row Store is the SAP HANA database's row-based in-memory relational data engine. It is optimized for high performance of write operations and is interfaced from the calculation/execution layer. Optimized write and read operations are possible due to storage separation, i.e. Transactional Version Memory and the Persisted Segment. • Transactional Version Memory contains temporary versions, i.e.

recent versions of changed records. This is required for multi-version concurrency control (MVCC). Write operations mainly go into Transactional Version Memory; an INSERT statement also writes to the Persisted Segment.

• The Persisted Segment contains data that may be seen by any ongoing active transaction, i.e. data that was committed before any active transaction was started. • Version Memory Consolidation moves the recent versions of changed records from Transactional Version Memory to the Persisted Segment based on the commit ID.

It also clears outdated record versions from Transactional Version Memory and can be considered the garbage collector for MVCC. • Segments contain the actual data (the content of row-store tables) in pages. Row-store tables are linked lists of memory pages. Pages are grouped into segments.

The typical page size is 16 KB. • The Page Manager is responsible for memory allocation. It also keeps track of free and used pages. • The Column Store is the SAP HANA database's column-based in-memory relational data engine. Parts of it originate from TREX (Text Retrieval and Extraction), i.e. SAP NetWeaver Search and Classification.

For the SAP HANA database this proven technology was further developed into a full relational column-based data store. It offers efficient data compression and is optimized for high performance of read operations, interfaced from the calculation/execution layer. Optimized read and write operations are possible due to storage separation, i.e. Main and Delta. • Main Storage contains the compressed data in memory for fast reads.

• Delta Storage is meant for fast write operations. An update is performed by inserting a new entry into the delta storage. • Delta Merge is an asynchronous process that moves the changes in the delta storage into the compressed and read-optimized main storage. Even during the merge operation the columnar table is still available for read and write operations. To fulfil this requirement, a second delta and main storage are used internally.

• During a read operation, data is always read from both the main and delta storages and the result sets are merged. The engine uses multi-version concurrency control (MVCC) to ensure consistent read operations.
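The main/delta split described above can be sketched in a few lines (this is an illustrative toy, not HANA's actual data structures): writes append to a small delta, reads merge main and delta, and an asynchronous delta merge folds the delta into the read-optimized main storage.

```python
# Toy model of a column-store table with main + delta storage.
class ColumnTable:
    def __init__(self):
        self.main = []    # compressed, read-optimized storage (kept sorted here)
        self.delta = []   # write-optimized, append-only storage

    def insert(self, value):
        self.delta.append(value)                # fast write: delta only

    def read_all(self):
        return sorted(self.main) + self.delta   # reads merge both storages

    def delta_merge(self):
        # fold the delta into main and empty it (in HANA this runs
        # asynchronously against a second delta/main pair)
        self.main = sorted(self.main + self.delta)
        self.delta = []

table = ColumnTable()
for value in [3, 1, 2]:
    table.insert(value)
table.delta_merge()
table.insert(5)
print(table.read_all())   # [1, 2, 3, 5]
```

Note how the last insert lands only in the delta, yet `read_all` still sees it: that is exactly the "read from both storages and merge" behaviour described above.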

• As row tables and columnar tables can be combined in one SQL statement, the corresponding engines must be able to consume intermediate results created by each other. A main difference between the two engines is the way they process data: row store operators process data in a row-at-a-time fashion using iterators, while column store operations require that the entire column is available in contiguous memory locations. To exchange intermediate results, the row store can provide results to the column store materialized as complete rows in memory, while the column store can expose results using the iterator interface needed by the row store. • The Persistence Layer is responsible for the durability and atomicity of transactions.
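The two exchange directions just described can be sketched as a pair of converters (illustrative only; these helper names are invented and do not reflect HANA internals):

```python
# Row engine: row-at-a-time iterators. Column engine: whole contiguous columns.
rows = [{"id": 1, "qty": 10}, {"id": 2, "qty": 20}]

def rows_to_columns(row_iter):
    """Materialize an iterator of rows into complete columns."""
    materialized = list(row_iter)
    return {key: [r[key] for r in materialized] for key in materialized[0]}

def columns_to_rows(columns):
    """Expose column data through a row-at-a-time iterator."""
    for values in zip(*columns.values()):
        yield dict(zip(columns.keys(), values))

cols = rows_to_columns(iter(rows))   # row store -> column store: materialize
print(cols["qty"])                   # whole column in one contiguous list
print(next(columns_to_rows(cols)))   # column store -> row store: iterate
```

The asymmetry mirrors the text: going row-to-column forces full materialization, while column-to-row only needs a cheap iterator wrapper.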

It ensures that the database is restored to the most recent committed state after a restart and that transactions are either completely executed or completely undone. To achieve this goal in an efficient way, the persistence layer uses a combination of write-ahead logs, shadow paging and savepoints. The persistence layer offers interfaces for writing and reading data.
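The savepoint-plus-redo-log idea can be shown with a toy database (purely illustrative; HANA's persistence layer is far richer, and all names here are made up):

```python
import copy

# Toy database: periodic savepoints plus a write-ahead redo log.
class ToyDB:
    def __init__(self):
        self.data = {}
        self.savepoint = {}   # last on-disk snapshot
        self.log = []         # redo log of committed changes since the savepoint

    def commit(self, key, value):
        self.log.append((key, value))   # write-ahead: log before applying
        self.data[key] = value

    def write_savepoint(self):
        self.savepoint = copy.deepcopy(self.data)
        self.log = []                   # log only needs post-savepoint changes

    def restart(self):
        # restore the last savepoint, then replay the redo log
        self.data = copy.deepcopy(self.savepoint)
        for key, value in self.log:
            self.data[key] = value

db = ToyDB()
db.commit("a", 1)
db.write_savepoint()
db.commit("b", 2)     # committed after the savepoint: survives via the log
db.data.clear()       # simulate a crash wiping in-memory state
db.restart()
print(db.data)        # {'a': 1, 'b': 2}
```

The committed-after-savepoint change survives the simulated crash only because it was logged before being applied, which is the essence of the write-ahead rule.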

It also contains SAP HANA's logger, which manages the transaction log. Log entries can be written implicitly by the persistence layer when data is written via the persistence interface, or explicitly by using a log interface. Distributed System and High Availability: The SAP HANA appliance software supports high availability. SAP HANA scales systems beyond one server and can remove the possibility of a single point of failure. A typical distributed scale-out cluster landscape will therefore have many server instances in a cluster, so large tables can be distributed across multiple servers and queries can be executed across servers.

SAP HANA distributed systems also ensure transaction safety. Features: • N active servers or worker hosts in the cluster. • M standby server(s) in the cluster. • A shared file system for all servers. Several instances of SAP HANA share the same metadata.

• Each server hosts an Index Server and a Name Server. • Only one active server hosts the Statistics Server. • During startup, one server gets elected as Active Master. • The Active Master assigns a volume to each starting Index Server, or no volume in the case of cold standby servers. • Up to 3 Master Name Servers can be defined or configured.

• A maximum of 16 nodes is supported in High Availability configurations.

Name Server (configured)  Name Server (actual)  Index Server (configured)  Index Server (actual)
Master 1                  Master                Worker                     Master
Master 2                  Slave                 Worker                     Slave
Master 3                  Slave                 Worker                     Slave
Slave                     Slave                 Standby                    Standby

Failover: • High Availability enables the failover of a node within one distributed SAP HANA appliance.

Failover uses a cold standby node and is triggered automatically. When an active server X fails, standby server N+1 reads the indexes from the shared storage and takes over the logical connections of the failed server X. • If the SAP HANA system detects a failover situation, the work of the services on the failed server is reassigned to the services running on the standby host. The failed volume and all the included tables are reassigned and loaded into memory in accordance with the failover strategy defined for the system. This reassignment can be performed without moving any data, because all the persistency of the servers is stored on a shared disk. Data and logs are stored on shared storage, where every server has access to the same disks. • The Master Name Server detects an Index Server failure and executes the failover.

During the failover, the Master Name Server assigns the volume of the failed Index Server to the cold standby server. In the case of a Master Name Server failure, another of the remaining Name Servers becomes Active Master. • Before a failover is performed, the system waits for a few seconds to determine whether the service can be restarted. A standby node can take over the role of a failing master or a failing slave node.
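The reassignment rule above can be sketched as a small function (a toy illustration of the idea; the host names and dictionary layout are invented, and real HANA failover involves much more):

```python
# Toy cluster: workers own volumes on shared storage; a standby owns none.
hosts = {
    "host1": {"role": "worker", "volume": 1},
    "host2": {"role": "worker", "volume": 2},
    "host3": {"role": "standby", "volume": None},
}

def failover(failed_host):
    """Reassign the failed host's volume to the first available standby."""
    volume = hosts[failed_host]["volume"]
    hosts[failed_host] = {"role": "failed", "volume": None}
    for host, info in hosts.items():
        # no data moves: the standby simply loads the volume from shared disk
        if info["role"] == "standby":
            info.update(role="worker", volume=volume)
            return host

taken_over_by = failover("host2")
print(taken_over_by, hosts[taken_over_by])
```

Because the volume lives on shared storage, the "transfer" is just a change of ownership followed by loading the tables into the standby's memory, exactly as the text describes.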

Best Practice to Publish SAP Syclo Code. The following steps explain a scenario for the best practice to publish Agentry code when developers are located in different countries. 1. Developers (Developer A, Developer B and Developer C, located in 3 different countries): 1.1 Unit test their code using their local environment (Eclipse, Agentry server). 1.2 Commit their code to the central repository (Agentry share, SVN) once unit tested.

=> This ensures there is no impact on testers or other developers. For a future release, for example Sprint 2 (in Syclo terminology): 2. Demo Sprint 2 builders (Developers A and B): 2.1 Publish the Agentry application to the central Agentry server. 2.2 Export the Java code to the central Agentry server. 2.3 Restart the central Agentry server. => This builds a coherent application, with the possibility to revert to a previous working level. 3. Functionality testers (X, Y and Z): 3.1 Run functional tests of the build using an Agentry client on their laptops connected to the central Agentry server.

BC T-Codes, Tables & Functions. SAP BC TRANSACTIONS / T-CODES BY MODULE. LSMW Legacy System Migration Workbench. An add-on available from SAP that can make data conversion a lot easier. Thanks to Serge Desland for this one. DI02 ABAP/4 Repository Information System: Tables. OSS1 SAP Online Service System. OY19 Compare Tables. SM13 Update monitor.

Will show update tasks status. Very useful to determine why an update failed. S001 ABAP/4 Development Workbench S002 System Administration SA38 Execute a program SCAT Computer Aided Test Tool SCU0 Compare Tables SE01 Old Transport & Corrections screen SE03 Groups together most of the tools that you need for doing transports. In total, more than 20 tools can be reached from this one transaction. SE09 Workbench Organizer SE10 New Transport & Correction screen SE11 ABAP/4 Dictionary Maintenance SE12 ABAP/4 Dictionary Display (Dictionary: Initial Screen - enter object name) SE13 Maintain Technical Settings (Tables) SE14 Utilities for Dictionary Tables SE15 ABAP/4 Repository Information System SE16 Data Browser: Initial Screen SE16N Table Browser (the N stands for New; it replaces SE16).

Provided by Smijo Mathew. SE17 General Table Display SE24 Class Builder SE30 ABAP/4 Runtime Analysis SE32 ABAP/4 Text Element Maintenance SE35 ABAP/4 Dialog Modules SE36 ABAP/4: Logical Databases SE37 ABAP/4 Function Modules SE38 ABAP Editor SE39 Splitscreen Editor: Program Compare SE41 Menu Painter SE43 Maintain Area Menu SE48 Show program call hierarchy. Very useful to see the overall structure of a program.

Thanks to Isabelle Arickx for this tcode. SE49 Table manipulation. Show what tables are behind a transaction code. Thanks to Isabelle Arickx for this tcode. SE51 Screen Painter: Initial Screen SE54 Generate View Maintenance Module SE61 R/3 Documentation SE62 Industry utilities SE63 Translation SE64 Terminology SE65 R/3 document. Short text statistics SE66 R/3 Documentation Statistics (Test!) SE68 Translation Administration SE71 SAPscript layout set SE71 SAPScript Layouts Create/Change SE72 SAPscript styles SE73 SAPscript font maintenance (revised) SE74 SAPscript format conversion SE75 SAPscript Settings SE76 SAPscript Translation Layout Sets SE77 SAPscript Translation Styles SE80 ABAP/4 Development Workbench SE81 SAP Application Hierarchy SE82 Customer Application Hierarchy SE83 Reuse Library.

Provided by Smijo Mathew.
SE84 ABAP/4 Repository Information System
SE85 ABAP/4 Dictionary Information System
SE86 ABAP/4 Repository Information System
SE87 Data Modeler Information System
SE88 Development Coordination Info System
SE91 Maintain Messages
SE92 Maintain system log messages
SE93 Maintain Transaction
SEARCH_SAP_MENU From the SAP Easy Access screen, type it in the command field and you will be able to search the standard SAP menu for transaction codes / keywords. It will return the nodes to follow.
SEU Object Browser
SHD0 Transaction variant maintenance
SM04 Overview of Users (cancel/delete sessions)
SM12 Lock table entries (unlock locked tables)
SM21 View the system log; very useful when you get a short dump.

Provides much more info than the short dump.
SM30 Maintain Table Views
SM31 Table Maintenance
SM32 Table maintenance
SM35 View Batch Input Sessions
SM37 View background jobs
SM50 Process Overview
SM51 Delete jobs from system (BDC)
SM62 Display/Maintain events in SAP; also use function BP_EVENT_RAISE
SMEN Display the menu path to get to a transaction
SMOD/CMOD Transactions for processing/editing/activating new customer enhancements
SNRO Object browser for number range maintenance
SPRO Start SAP IMG (Implementation Guide)
SQ00 ABAP/4 Query: Start Queries
SQ01 ABAP/4 Query: Maintain Queries
SQ02 ABAP/4 Query: Maintain Funct. Areas
SQ03 ABAP/4 Query: Maintain User Groups
SQ07 ABAP/4 Query: Language Comparison
ST05 Trace SQL Database Requests
ST22 ABAP Dump Analysis
SU53 Display Authorization Values for User
WEDI EDI Menu. IDoc and EDI base.
Tables:
EINA Purchasing Info Record - General Data
EINE Purchasing Info Record - Purchasing Organization Data
MAKT Material Descriptions
MARA General Material Data
MARC Plant Data for Material
MARD Storage Location Data for Material
MAST Material to BOM Link
MBEW Material Valuation
MKPF Header - Material Document
MSEG Document Segment - Material
MVER Material Consumption
MVKE Sales Data for Materials
RKPF Document Header - Reservation
T023 Mat.

Tcodes Description
ME01 Maintain Source List
ME03 Display Source List
ME04 Changes to Source List
ME05 Generate Source List
ME06 Analyze Source List
ME07 Reorganize Source List
ME08 Send Source List
ME0M Source List per Material
ME11 Create Purchasing Info Record
ME12 Change Purchasing Info Record
ME13 Display Purchasing Info Record
ME14 Changes to Purchasing Info Record
ME15 Flag Purch. for Deletion
ME16 Purchasing Info Recs.

We raise an invoice to a customer. Now if the customer pays within 21 days he gets a 10% discount, and if he pays within 30 days he gets a 5% discount. How do we configure this scenario? Answer: In the above scenario, we can choose payment terms as one of the fields in the condition table. The payment terms can be defined as follows: NT21 - within 21 days due net (for NT21 the customer would get a 10% discount); NT30 - within 30 days due net (for NT30 the customer would get a 5% discount). Upon selecting the relevant payment terms, the system determines the percentage discount in the document. ------------------------------------------------------------------------------------------------ Now consider the scenario you mentioned: 'unless the customer pays the amount, the payment date is not known, hence we don't know which payment term to use and which discount to apply'.

Payment terms => 21 days, 10% cash discount; 30 days, 5% cash discount. Then we have to use the condition type SKTO. It is a special condition type used strictly for this scenario, i.e. determining which discount should be applicable based on the payment term. This condition type is not passed to accounting, and generally not to COPA either (as you can see, no account keys are maintained for this condition type in the pricing procedure). The condition category E, cash discount (in V/06), tells the system to go get the payment terms and calculate the potential/actual value, i.e.

10% within 21 days and 5% within 30 days. Whatever the payment terms turn out to be at the time of payment, the invoice value will not change and remains the same; SKTO corrects the value, and the discount is calculated in A/R instead.
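The arithmetic SKTO performs can be illustrated outside SAP. Below is a minimal sketch in plain Python; the payment-terms table and figures are hypothetical, mirroring the NT21/NT30 scenario above, and this is not SAP code:

```python
# Hypothetical payment terms: (days allowed, cash discount rate),
# sorted by days ascending -- mirrors NT21 (10%) and NT30 (5%) above.
PAYMENT_TERMS = [(21, 0.10), (30, 0.05)]

def cash_discount(invoice_value, days_to_payment):
    """Return the cash discount earned for paying within a given window."""
    for days, rate in PAYMENT_TERMS:
        if days_to_payment <= days:
            return invoice_value * rate
    return 0.0  # paid after the last window: no discount

# An invoice of 1,000 paid on day 25 falls into the 30-day / 5% window.
print(cash_discount(1000.0, 25))  # 50.0
```

As described above, the invoice value itself stays unchanged; only the discount amount varies with the actual payment date.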

SMS is a killer app. It’s so simple. There’s no app to download.

There’s no unexpected crashing. You don’t have to have a smartphone. In a world with a lot of complicated technology, I love to see companies still finding creative, useful ways to use SMS.

(Yes, I’ve beaten this drum.) Like this one: Birds Eye has partnered with a couple of health-related nonprofits to help end childhood obesity in the U.S. The program is called.

Text a shortcode to subscribe, and you get a couple of text messages, each with recipes, nutritional information and tips about making healthy food choices. At London's Charing Cross underground station, the police have a new campaign that solicits information about non-emergency incidents - by text. You simply send a text to a shortcode. The police gather more information this way, and can probably save time on typing up reports with the old copy-and-paste. Another at Charing Cross: Brook Street recruitment firm is using SMS as a simple and immediate call-to-action for managers looking to attract and retain new staff.

SMS makes for an ideal CTA, as you can quickly fire off a text. It's very unlikely you would stop and send an email on your way through the station, let alone download an app. Bank of Queensland customers can find the nearest ATMs and branches by texting their location (city and state, or postcode) to a shortcode. The bank will return an SMS with the locations of up to four ATMs or branches. Earlier this year, Citi introduced a that extends how much banking you can do outside of a branch. It has biometric identity authentication, an online banking connection, video conferencing and - SMS!

SMS still provides the universal mobile service to Citi account holders, which the bank uses for sending information, alerts, dispute resolution notices and one-time-pin for online banking. And one more train station example: Colgate has been running a promotion for its new electric toothbrushes at London’s Waterloo Station.

In addition to the electronic billboard ads, radio spots and newspaper inserts, Colgate used SMS so you could set a reminder to go to the station at the right time. Although, judging by the huge queues I saw that day, perhaps it was a little too successful! What Can Mobile App Development Teams Learn from a Spinning Top? For many, it is hard to imagine a world where simpler, non-electronic toys were the primary options for fun. How quickly we seem to forget! IT and C-level executives might be surprised to discover there are three business lessons that can still be learned from a simple toy like a spinning top. Here are three for consideration: Simplicity can increase durability. Sometimes, the simplest concept can stand the test of time.

Archeologists have found spinning tops that date back over five thousand years. And, here’s the most amazing stat: they still function today exactly as they did then. What about your mobile app? How will it stand the test of time? Is anyone taking bets that a cell phone, or any of its apps, will still be working five thousand years from now?

How about one year from now? Tipping points. Without going into the physics of how a spinning top works, suffice it to say that once a top is correctly spinning on a smooth surface, it will continue to do so for as long as its spinning inertia can maintain a balance. Once the inertia begins to slow, balance will falter, and the spinning top will revert to being just an inert object. Eerily, this description fits mobility software programs, too.

Finding the right balance of software features and functions, without making the product overloaded, may make the critical difference in its lifespan. There are tipping points at which all software programs stop being useful. And an unused software app is another definition of an inert object. Have you identified your tipping points? User interfaces. A complex concept implies complex user interfaces. Plus, a complex concept has more points of failure than a simpler one.

A spinning top is an intuitive product. The very design of a top invites the user to give it a spin with a flick of the wrist. When users look at your mobile app, what is appealing and inviting about it? Is it intuitive or intimidating?

Are users ready to give it a flick or a swipe to get started? Final thoughts What are your best case hopes and aspirations for the life of your mobile app?

More than two years? More than five years?? Perhaps emulating the lessons learned from a spinning top will help produce positive influences on your mobile application projects. Source from: 30.

SAP ERP Simulation Game by BATON Simulations The SAP ERP application has the potential to transform your business. But getting the best possible return on your software investment ultimately comes down to several things including the strength of your implementation team, business user ability, and how well managers and executives understand what the software can do. The likelihood of a successful implementation increases dramatically when your team is committed and excited to learn how to use the software.

But instilling this excitement can be a challenge. Research shows that traditional training methods focusing on transactions and keystrokes aren't as effective as experiential, hands-on learning practices. To take full advantage of the power and potential of SAP software, your business needs to be engaged and invested in the learning process. The SAP ERP Simulation game by Baton Simulations offers your organization a proven way to get new users to accept SAP software. It also helps existing users increase their understanding of the software so they can use it more effectively and collaboratively in your organization. As a hands-on learning game, SAP ERP Simulation is played in a classroom setting, in teams, on a live instance of the SAP ERP application.

The interactivity of the game encourages your learners to work together to achieve true collaboration during the execution of enterprise business processes. Here is how the game is played: Throughout the game, participants interact in the software to demonstrate how their individual contributions impact other parts of the business. For example, one team member prepares a forecast and orders raw materials. To do so, they access screens and transactions relating to independent requirements and material resource planning. At the same time, another team member adjusts pricing and makes marketing decisions based on sales data, and market intelligence, that is being monitored by a third team member.

Your learners will see that the best results require not only great individual execution, but great teamwork. During the course of the game, team members will: • interact with suppliers and customers by sending and receiving orders, • manage inventory levels, • document the production and delivery of products, • manage cash flow, and • make decisions about marketing, plant, and distribution-system improvements.

The game accelerates participants along the learning curve. It also generates tremendous motivation among business users, executives, and project teams. After playing the game, participants report: • significantly increased skill, • more positive attitudes, • greater confidence in their ability to master software transactions and reports, and • stronger belief in the potential of SAP software to add value to the enterprise. In short, they are ready to go -- with enthusiasm, understanding, and positive expectations.

Positive attitudes and adequate preparation can reduce your organization's training and support costs while shortening the ramp-up time for new users. The game provides deep learning embedded in engaged doing -- with results that help your organization achieve the best possible return on your software investment. Ultimately, SAP ERP Simulation adds value to your enterprise by helping business users leverage the power and potential of SAP software. PRODUCT DETAILS SAP ERP Simulation is a classroom based competitive business game, played in a live SAP environment. It provides a compelling way for the learning 2.0 generation members of your workforce to harness the power of SAP solutions in their day to day activities.

SAP ERP Simulation can be purchased in two ways: • As a six (6) month subscription, allowing multiple members of your organization to access and utilize the system at their convenience. • As a single-day game delivered at the customer location. LINK TO THE US SAP ERP SIMULATION BY BATON SIMULATIONS WEBPAGE: LINK TO OVERVIEW DEMONSTRATION OF SAP ERP SIMULATION BY BATON SIMULATIONS: LINK TO QUICK OVERVIEW OF SAP ERP SIMULATION BY BATON SIMULATIONS: Src: 31. SAP BW 7.30: Performance Improvements in Master-Data-Related Scenarios and DTP Processing. Data loads into a master-data-bearing characteristic require database lookups to find out whether records exist on the database with the same key as the ones being loaded.

In releases prior to SAP BW 7.3, this operation was performed record-wise, i.e. for every record in the data package, a SELECT was executed on the database table(s). Obviously, this resulted in a lot of communication overhead between the SAP application server and the database server, thereby slowing master data loads down. The effect is pronounced on data loads involving large data volumes. This overhead has now been addressed by performing a mass lookup on the database, so that all records in the data package are looked up in one attempt. Depending on the DB platform, this can bring up to a 50% gain in load runtimes. The 'Insert-Only Flag' for master data loads: • Starting with NW 7.30 SP03, this flag will be renamed to "New Records Only".
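The difference between the record-wise lookup and the mass lookup described above can be sketched with plain Python and SQLite. This only illustrates the access pattern (one round trip per record versus one set-based query); BW obviously does this in ABAP against its own master data tables:

```python
import sqlite3

# Toy "master data table" standing in for a characteristic's P-table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE p_material (matnr TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO p_material VALUES (?)",
                 [("M1",), ("M2",), ("M3",)])

datapackage = ["M2", "M3", "M4", "M5"]

# Record-wise lookup: one SELECT (one round trip) per record.
existing_slow = {m for m in datapackage
                 if conn.execute("SELECT 1 FROM p_material WHERE matnr = ?",
                                 (m,)).fetchone()}

# Mass lookup: all keys of the data package checked in a single SELECT.
placeholders = ",".join("?" * len(datapackage))
rows = conn.execute(
    f"SELECT matnr FROM p_material WHERE matnr IN ({placeholders})",
    datapackage).fetchall()
existing_fast = {r[0] for r in rows}

# Both approaches find the same existing keys; the mass lookup just
# avoids the per-record communication overhead.
assert existing_slow == existing_fast == {"M2", "M3"}
```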

The renaming was done to align with a similar feature supported by the activation of DSO data. Deleting master data in BW has always been a performance-intensive operation.

The reason is that before any master data can be physically deleted, the entire system (transaction data, master data, hierarchies, etc.) is scanned for usages. Therefore, if a lot of master data is to be deleted, it takes some time to establish which data is deletable (i.e., has no usages) and which is not (has usages). In addition, with the classical master data deletion involving large data volumes, users sometimes ran into memory overflow dumps. Quite often there are scenarios in SAP BW where data being loaded from a source to a target needs to be augmented with information that is looked up from the master data of InfoObjects.

For instance - loading sales data from a source that contains data on Material level to a DataTarget where queries require the sales data to be aggregated by Material Group. In such cases, the Master Data Lookup rule-type in Transformations is used to determine the Material Group for any given Material (given that MaterialGroup is an attribute of Material). Although the performance of the Masterdata Lookup rule-type has been optimized in earlier versions of BW (starting BW 7.0), there is an alternative to this rule-type in BW 7.30.

Now, navigational attributes of Infoobjects are available as source fields in Transformations. The benefits of this feature are two-pronged. • The fact that the data from the navigational attributes is available as part of the source structure allows the data to be used in custom logic in Transformations (example: Start Routines). • Secondly, the data from the navigational attributes is read by performing database joins with the corresponding Masterdata tables during extraction.

This helps in improving the performance of scenarios where a lot of lookups are needed and/or a lot of data is to be looked up.
ATTYP Material Category
INVKZ Indicator: Movement type is physical inventory
ITEM_CAT EA Retail BW Extr. Enhancement: Item Type
KORR Indicator: Movement type is inventory correction
MATST Structured material
REC_TYPE EA Retail BW Extr. Enhancement: Data Record Type
RETKZ Indicator: Return from call
SAUTO Item automatically created
UMLKZ Indicator: Movement type is stock transfer
UMMATKZ Transfer Posting Material to Material
VKSTA Value at sales prices excluding value-added tax
VKSTT Value at sales prices including value-added tax
VLFKZ Plant category
WAERSST Currency Key.

Could someone please help me reduce the completion time of the create-index variant? Sol: Hopefully you are dropping indexes before you load to the cube. This will improve loading performance. Creating secondary indexes on cubes is not mandatory unless your reporting performance has decreased. If there is no such issue, you can remove the drop/build index steps from your process chain, because eventually the data in the cube will increase, and it will become very difficult to maintain in terms of roll-up of aggregates, compression, etc.

These indexes are just secondary indexes on the cube. Your actual performance of the cube depends on the design of your dimensions/chars. You can improve its performance by Aggregates, Compression, partitioning etc.

Creating indexes is always time consuming, as your cube is full-load based. The data will be increasing like hell. It is not mandatory to drop the index before loading data to the cube. If your daily delta load is more than 10% of the data in the cube (not a hard and fast rule), then it makes sense to drop the index and recreate it after the load. Also, if the cube is not compressed, the create-index time will be longer, as each request forms a segment and index creation happens on each of these segments. So do the following for your issue.

1) Try loading without dropping the index.
2) If you get a DBIF_RSQL_SQL_ERROR during the load, which happens when the index is not dropped, then go for drop-and-recreate.
3) Compress the cube if you are dropping and recreating the index.
If the data size is high, the creation of indexes is going to take time.
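The rule of thumb above (drop and rebuild the indexes only when the delta load is large relative to the cube) can be written as a tiny decision helper. This is a sketch of the guideline only, not anything SAP ships:

```python
def should_drop_indexes(cube_rows, delta_rows, threshold=0.10):
    """Drop/rebuild secondary indexes only if the delta load is a
    sizeable fraction of the data already in the cube. The default 10%
    mirrors the guideline above; it is not a hard and fast rule."""
    if cube_rows == 0:
        return True  # initial full load: build indexes afterwards
    return delta_rows / cube_rows > threshold

print(should_drop_indexes(1_000_000, 50_000))   # False: load with indexes
print(should_drop_indexes(1_000_000, 200_000))  # True: drop, load, rebuild
```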

Generally it is advisable to delete and rebuild the indexes at the time of data load; the benefit is a faster data load, as loading data into an already-indexed table is slower than loading without indexes. If you keep the indexes as they are and load the data, you are compromising on data load time at the cost of saving index creation time. One more thing to consider in the above case: a deadlock issue may arise if your DTP is using more than 1 batch process, so make it 1 to avoid an Oracle deadlock during loading (this way you are further increasing the data load time).

You have to make the decision based on the scenario/time. SAP BI Questions and Answers I. Some free basic questions and answers on SAP BW / SAP BI. Hope this helps all users; these are some basic questions in SAP BW, and soon I will be posting some more. Enjoy!! What are the advantages of the extended star schema over the star schema?

• Uses generated numeric keys and aggregates in its own tables for faster access. • Uses an external hierarchy. • Supports multiple languages. • Contains master data common to all cubes. • Supports slowly changing dimensions.

What is the 'myself data mart'? A BW system feeding data to itself is called the myself data mart. It is created automatically and uses ALE for data transfer. How many dimensions are there in a cube?

There are a total of 16 dimensions in a cube. Of these 16, 3 are predefined by SAP: time, unit and request. This leaves the customer with 13 dimensions. What is an aggregate? Aggregates are mini cubes. They are used to improve performance when executing queries. You can equate them to indexes on a table.

Aggregates are transparent to the user. What is the transaction for the Administrator workbench? Transaction RSA1 What is a calculated key figure? A calculated key figure is used to do complicated calculations on key figures such as mathematical functions, percentage functions and total functions.

For example, you can have a calculated key figure to calculate sales tax based on your sale price. What is the enhancement user exit for BEx reporting? RSR00001. What is a characteristic variable? You can have dynamic input for characteristics using a characteristic variable. For example, if you are developing a sales report for a given product, you will define a variable for 0MATERIAL.

What is a condition? If you want to filter on key figures or do a ranked analysis then you use a condition. For example, you can use a condition to report on the top 10 customers, or customers with more than a million dollars in annual sales.

What are the differences between OLAP and OLTP?
OLAP: summarized data; read only; not optimized; a lot of historical data.
OLTP: detailed data; read/write; optimized for data applications; less historical data.
What is a star schema? A fact table at the centre, surrounded by (linked to) dimension tables. What is a slowly changing dimension? A dimension containing characteristics whose values change over a time period. For example, take an employee's job title; this changes over a period of time as the employee moves through an organization.

This is called a slowly changing dimension. What are the advantages of the extended star schema over the star schema? • Use of generated (numeric) keys for faster access • External hierarchy • Support for multiple languages • Master data is common to all cubes • Supports slowly changing dimensions • Aggregates in its own tables, which allows for faster access. What is the namespace for BW? All SAP objects start with 0.

The customer namespace is A-Z. All tables begin with /BI0 for SAP and /BIC for customers. All generated objects (like the export DataSource) start with 1-8. The prefix 9A is used in APO. What is an InfoObject?

InfoObjects are business objects, e.g. customer and product. They are divided into characteristics and key figures. Characteristics are evaluation objects, such as customer, and key figures are measurable objects, such as sales quantity.

Characteristics also include special objects like unit and time. What are time-dependent texts / attributes of characteristics? If a text (for example the name of a product or person) or an attribute changes over time, it must be marked as time dependent. Can you create your own time characteristics? No. What is meant by alpha conversion?

Alpha conversion is used to store data consistently. It does this by prefixing numeric values with '0', e.g. if you have defined a material number of length 6 (of type NUMC), then material number 1 is stored as 000001 but displayed as 1; this removes inconsistencies such as '01' vs. '1'. What is the alpha check execution program? This is used to check consistency for BW 2.x before upgrading the system to 3.x. It is RSMDCNVEXIT. What is the attributes-only flag?

If this flag is set, no master data is stored; the characteristic is used only as an attribute for other characteristics, for example comments on an accounts receivable document. What are the data types allowed for key figures?

Time. What are the aggregation options for key figures? If you are defining prices, then you may want to set 'no aggregation', or you can define max, min, sum. You can also define exception aggregation like first, last, etc. This is helpful in getting a headcount, e.g.

if you define a monthly inventory count key figure, you want the count as of the last day of the previous month. What is the maximum number of key figures you can have in an InfoCube? 233. What is the maximum number of characteristics you can have per dimension? 248. What is a SID table and what are its advantages?

The SID table (Surrogate ID table) is the interface between master data and the dimension tables. Advantages include: use of integer values as indexes for faster access; master data is independent of InfoCubes; support for multiple languages; support for slowly changing dimensions. What is the transfer routine of the InfoObject? It is like a start routine; it is independent of the DataSource and valid for all transfer rules; you can use it to define global data and global checks. What is the DIM ID?

These are dimension IDs. DIM IDs link the dimensions to the fact table. A DIM ID is an 8-byte integer, like the SID.
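Both SIDs and DIM IDs are surrogate keys: compact integers standing in for wide master data keys. A toy version of the mapping, in Python for illustration only (BW maintains these tables itself):

```python
# Toy SID table: maps a master-data key (e.g. a customer number) to a
# compact integer surrogate, as BW's SID tables do.
sid_table = {}

def get_sid(key: str) -> int:
    """Return the SID for a master-data key, assigning a new one if needed."""
    if key not in sid_table:
        sid_table[key] = len(sid_table) + 1
    return sid_table[key]

# Dimension and fact rows store the small integer, not the wide key,
# which keeps the fact table narrow and the joins fast.
fact_row = {"customer_sid": get_sid("CUST-4711"), "revenue": 120.0}
print(fact_row["customer_sid"])  # 1
```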

What is a table partition? By partitioning we split the table into smaller tables, which is transparent to the application. This improves performance (when reading as well as deleting data).

SAP uses fact table partitioning to improve performance. Note that you can only partition on 0CALMONTH or 0FISCPER.

Remember that the partition is created only on the E fact table; the F fact table is partitioned by request number by default. Advantages of a partition: • makes use of parallel processes • allows a smaller set of data to be read • allows fast deletion. How many extra partitions are created and why? Can you partition a cube with data? Usually 2 extra partitions are created: one to accommodate data before the beginning of the partitioning period and one for data after its end. No, you cannot partition a cube with data.
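The advantages of partitioning - reading a smaller set of data and fast deletion by dropping a whole partition - can be pictured with a toy model (plain Python, not BW internals):

```python
from collections import defaultdict

# Toy E fact table partitioned by 0CALMONTH: one bucket per month.
partitions = defaultdict(list)
for month, amount in [("202401", 10), ("202401", 20), ("202402", 5)]:
    partitions[month].append(amount)

# Reading one month touches only one partition, not the whole table.
jan_total = sum(partitions["202401"])

# Deleting a month means dropping a partition, not deleting row by row.
del partitions["202402"]

print(jan_total, sorted(partitions))  # 30 ['202401']
```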

A cube must be empty to partition it. One workaround is to make a copy of cube A as cube B, then export the data from A to B using an export DataSource. Then empty cube A, create the partitions on A, re-import the data from B, and delete cube B. Note that this is going to change in NetWeaver 2004s (or BW 7). What is a source system? Any system that is sending data to BW, like R/3, a flat file, an Oracle database, or a non-SAP system. What is a DataSource and what is an InfoSource? DataSource: the source that is sending data to a particular InfoSource in BW. For example, we have a 0CUSTOMER_ATTR DataSource to supply attributes to 0CUSTOMER from R/3.

InfoSource: a group of logically related objects. For example, the 0CUSTOMER InfoSource will contain data related to the customer and attributes like customer number, address, phone number, etc. What are the 4 types of InfoSources? Transactional, Attributes,

Text, Hierarchy. What is a communication structure? It is an independent structure created from an InfoSource; it is independent of the source system / DataSource. What are transfer rules and what is the global transfer rule? Transfer rules: the transformation rules for data from the source system to the InfoSource / communication structure. These are used to clean up the data from the source system. For example, when you load customer data from a flat file, you can convert the name to upper case using a transfer rule.

Global transfer rule: this is a transfer routine (ABAP) defined at the InfoObject level. It is common to all source systems. What is the process of replication and what menu path would you use to perform it? Replication copies DataSource structures from R/3 to BW. For example, assume that you added a new DataSource in R/3. It will not be visible in the BW system until you replicate it. You replicate using transaction RSA1 >> Source Systems >> right-click on the system >> Replicate.

You can also replicate at an info area level. What is the update rule?

The update rules define the transformation of data from the communication structure to the data targets. This is independent of the source systems / DataSources. For example, you can use an update rule to globally change data independently of the source system. What are the options in update rules? One-to-one move of an InfoObject value, constant, lookup of a master data attribute value, formula, routine (ABAP), initial value. What are the special conversions for time in update rules? Time dimensions are automatically converted.

For example, if the cube contains calendar month and your transfer structure contains a date, the date is converted to calendar month automatically. What is the start routine? The first step in the update process is to call the start routine. Use this to fill global variables to be used in the update routines.

For example, you can define global values to be used by the update routines. It is also the first step in the transformation process, before the transfer rules. What is the conversion routine for units and currencies in the update rule? Using this option you can write ABAP code for unit / currency conversion. If you enable this flag, then the unit of measure of the key figure appears in the ABAP code as an additional parameter. For example, you can use this to convert a quantity in pounds to a quantity in kilograms.

How do you create the 'myself data mart'? The BW system feeding data to itself is called the myself data mart. It is created automatically and uses ALE for data transfer. • Right-click and create the export DataSource for the ODS/cube or PSA. • In the target system, replicate the DataSource. • Create transfer rules and update rules. • Create an InfoPackage to load. 36. Safety Upper Limit and Lower Limit - SAP BW Generic DataSource. Experience SAP Fiori. 42. Standard SAP Syclo deficiencies. While working on and implementing SAP Syclo at a large oil and gas company, an assessment conducted to evaluate the efficiency of the mobility solution for materials management determined that the current solution, as built to represent a common interface between SAP and mobility, did not provide the business the efficiencies expected from a mobile materials management solution.

The amount of data analysis and UI interaction detracted from the execution of the task at hand. To overcome this, we needed to create a usability enhancement layer placed on top of the original mobile application, without disrupting the business process already developed.

This independent layer uses the same fetches (data load), rules (business validation), and transactions (data sync) while simplifying the user experience, allowing the field user to focus on the execution of the task being performed and providing the efficiencies expected from a mobile solution. The efficiencies gained by using a mobile device to execute the materials management functions are tied directly to the user's ability to scan barcode labels in order to locate materials loaded on the device and navigate to the data input screens. Scan-enabled list screens are used to provide these efficiencies. These screens have a limited viewing time span, if any, and are conduits to the ultimate goal of the task: data input to sync back to SAP. GOOGLE PROJECT GLASS AND SAP. Google Glass (styled 'GLΛSS') is a wearable computer with an optical head-mounted display (OHMD) developed by Google with the mission of producing a mass-market computer.

Google Glass displays information in a smartphone-like, hands-free format and can interact with the Internet. Now SAP and Vuzix have teamed up to create augmented reality glasses, presenting them for manufacturers, logistics companies, and service technicians as well. The smart glasses can connect with a smartphone to access data, which is displayed on a screen in front of the person wearing the glasses.

The wearer can control the device through voice commands. For example, smart glasses can guide warehouse workers to the products on their pick lists. At the shelf they can scan the barcode to make sure they have the right item and confirm in the system that it has been picked. Forklift drivers can use the glasses to request help or instructions on how to resolve a technical problem, for instance.

SAP products that can be used with smart glasses are: SAP Inventory Manager, SAP Work Manager, SAP CRM Service Manager, SAP Rounds Manager, and SAP Machine-to-Machine Platform. According to Vuzix, the smart glasses can run on iOS and Android. SAP MM - STEPS TO CREATE A COST CENTER. To create a cost center, use transactions KS01, KS02 and KS03 to create, change and display cost centre master data. A cost centre in SAP is created to collect the costs of a particular area. A cost centre master record must contain the following information: basic data, a long text description, and classic system data entries made in the Address and Communications dialog boxes. Menu path: Accounting >> Controlling >> Cost Center Accounting >> Master Data >> Cost Center >> Individual Processing >> Create (KS01). The following initial screen will appear for creating a cost centre. Enter the relevant entries in the following fields (definitions taken from standard SAP): • Cost Centre - enter a name for the cost centre* • Valid From - enter the desired start date from which you want the cost centre to be used* • Valid To - enter the date after which the cost centre will prevent further postings to it. • Reference - in this section you can enter another cost centre from any selected controlling area, and the new cost centre being created will automatically have all the same attributes as the one you chose to copy. • Cost Centre - enter the number of the cost centre whose attributes you would like for your new cost centre.

You can always change the copied attributes once the KS01 transaction is executed, but it saves time to have most of the fields filled in automatically.

• Controlling Area - Select the controlling area to which the cost centre you wish to copy from is assigned.

Once all the data has been entered, click the Master Data button or press Enter.

The following report takes a delivery document number and delivery date from the user, fetches the details from the delivery table along with the corresponding sales order and billing details, and displays the sales order details in an ALV list. Report sales_order_report.

* Table declarations
TABLES: likp.                              " SD Document: Delivery Header Data

* Selection screen elements
SELECT-OPTIONS: s_deldoc FOR likp-vbeln,   " Delivery
                s_dldate FOR likp-lfdat.   " Delivery Date

*----------------------------------------------------------------------*
* Type declaration of the structure to hold the delivery header data   *
*----------------------------------------------------------------------*
TYPES: BEGIN OF type_s_likp,
         vbeln TYPE likp-vbeln,            " Delivery
         lfdat TYPE likp-lfdat,            " Delivery Date
         kunnr TYPE likp-kunnr,            " Ship-to party
       END OF type_s_likp.
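A hedged sketch of how the report above might continue, reusing the s_deldoc, s_dldate, and type_s_likp declarations: it selects the matching delivery headers and displays them with the standard function module REUSE_ALV_LIST_DISPLAY. The document-flow lookup of the preceding sales order via table VBFA is only indicated in a comment; the field catalogue texts are illustrative.

```abap
TYPE-POOLS: slis.

DATA: t_likp     TYPE STANDARD TABLE OF type_s_likp,
      t_fieldcat TYPE slis_t_fieldcat_alv,
      s_fieldcat TYPE slis_fieldcat_alv.

START-OF-SELECTION.
* Fetch the delivery headers matching the selection screen
  SELECT vbeln lfdat kunnr
    FROM likp
    INTO TABLE t_likp
    WHERE vbeln IN s_deldoc
      AND lfdat IN s_dldate.

* The preceding sales order could be read from the document flow,
* e.g. SELECT vbelv FROM vbfa ... WHERE vbeln = <delivery>
*      AND vbtyp_v = 'C'.                  " C = sales order

END-OF-SELECTION.
* Build a minimal field catalogue for the three columns
  s_fieldcat-fieldname = 'VBELN'.
  s_fieldcat-seltext_m = 'Delivery'.
  APPEND s_fieldcat TO t_fieldcat.
  CLEAR s_fieldcat.
  s_fieldcat-fieldname = 'LFDAT'.
  s_fieldcat-seltext_m = 'Delivery Date'.
  APPEND s_fieldcat TO t_fieldcat.
  CLEAR s_fieldcat.
  s_fieldcat-fieldname = 'KUNNR'.
  s_fieldcat-seltext_m = 'Ship-to party'.
  APPEND s_fieldcat TO t_fieldcat.

* Display the result as an ALV list
  CALL FUNCTION 'REUSE_ALV_LIST_DISPLAY'
    EXPORTING
      it_fieldcat = t_fieldcat
    TABLES
      t_outtab    = t_likp.
```

Reading the billing document from VBFA would follow the same pattern with a different document category.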

The links used in this tutorial are listed below. This video is intended to help you, as an SAP developer, consultant, or student, install SAP NetWeaver 7.3 64-bit for private use (a product series also known as 'MiniSAP'). Usually, installing and configuring such a development environment is the job of a company's IT department (administrator). Installing this software takes time and patience.

Please note that, according to SAP's installation guide, this TRIAL edition requires a 'Windows Server 2008' operating system, so Windows 10, for example, is not an option. Finally, please note that the provided links were tested in Germany (they should work worldwide as well) and were valid in April 2016 (and hopefully for many years to come). SAP and Microsoft may or may not change the way you download the software referenced in this tutorial; in that case you will need to search for it yourself.

Links:
• Download Windows Server 2008
• Download JDK 1.5

• Register on the SAP website
• Search for 'netweaver' and download it
• Request a license key