Thursday 21 April 2022
Unified Process (UP) Best Practices
- Get high-risk and high-value requirements first
- Constant user feedback and engagement
- Early cohesive core architecture
- Test early, often, and realistically
- Apply use cases where needed
- Do some visual modeling with UML
- Manage requirements and scope creep
- Manage change requests and configuration
How to persist the selected value of the select box after form submit?
<?php
$srf = "";
$srfErr = "";
if (isset($_POST['submit'])) {
if ($_POST['srf'] != "") {
$srf = $_POST['srf'];
} else {
$srfErr = "This field is required.";
}
}
?>
<form method="post">
<select name="srf" id="srf" class="form-control">
<option value="">Select</option>
<option <?php if ($srf == "1") echo "selected"; ?>>1</option>
<option <?php if ($srf == "2") echo "selected"; ?>>2</option>
<option <?php if ($srf == "3") echo "selected"; ?>>3</option>
</select>
<?php if ($srfErr != "") { ?>
<p><b><?php echo $srfErr; ?></b></p>
<?php } ?>
<input type="submit" name="submit" value="submit" />
</form>
Unified Process Phases
Inception
- Inception is not a requirements phase; rather, it is a feasibility phase, where just enough investigation is done to support a decision to continue or stop.
- The life-cycle objectives of the project are stated, so that the needs of every stakeholder are considered. Scope and boundary conditions, acceptance criteria and some requirements are established.
- Approximate vision, business case, scope, vague estimates.
Inception - Activities
- Formulate the scope of the project: Needs of every stakeholder, scope, boundary conditions and acceptance criteria established.
- Plan and prepare the business case: Define risk mitigation strategy, develop an initial project plan and identify known cost, schedule, and profitability trade-offs.
- Synthesize candidate architecture: Candidate architecture is picked from various potential architectures
- Prepare the project environment
Inception - Exit criteria
- An initial business case containing at least a clear formulation of the product vision - the core requirements - in terms of functionality, scope, performance, capacity, technology base.
- Success criteria (example: revenue projection).
- An initial risk assessment.
- An estimate of the resources required to complete the elaboration phase.
Elaboration
- An analysis is done to determine the risks, the stability of the vision of what the product is to become, the stability of the architecture, and the expenditure of resources.
- Refined vision, iterative implementation of core architecture, resolution of high risks, identification of most requirements and scope, more realistic estimates
- The products and artifacts described in the exit criteria of the previous phase.
- The plan has been approved by project management and the funding authority, and the resources required for the elaboration phase have been allocated.
- Define the architecture: Project plan is defined. The process, infrastructure and development environment are described.
- Validate the architecture.
- Baseline the architecture: To provide a stable basis for the bulk of the design and implementation effort in the construction phase.
- A detailed software development plan, with an updated risk assessment, a management plan, a staffing plan, a phase plan showing the number and contents of the iterations, an iteration plan, and a test plan.
- The development environment and other tools
- A baseline vision, in the form of a set of evaluation criteria for the final product.
- A domain analysis model, sufficient to be able to call the corresponding architecture ‘complete’.
- An executable architecture baseline.
Construction
- The Construction phase is a manufacturing process. It emphasizes managing resources and controlling operations to optimize costs, schedules and quality. This phase is broken into several iterations.
- Iterative implementation of the remaining lower risk and easier elements, and preparation for deployment.
- The product and artifacts of the previous iteration. The iteration plan must state the iteration-specific goals.
- Risks being mitigated during this iteration.
- Defects being fixed during the iteration.
- Develop and test components: the components required to satisfy the use cases, scenarios, and other functionality for the iteration are built. Unit and integration tests are performed on the components.
- Manage resources and control process.
- Assess the iteration: Satisfaction of the goal of iteration is determined.
- The same products and artifacts, updated, plus
- A release description document, which captures the results of an iteration
- Test cases and results of the tests conducted on the products
- An iteration plan, detailing the next iteration
- Objective measurable evaluation criteria for assessing the results of the next iteration(s).
Transition
- The transition phase is the phase where the product is put in the hands of its end users. It involves issues of marketing, packaging, installing, configuring, supporting the user community, making corrections, etc.
- Beta tests, deployment.
- The product and artifacts of the previous iteration, and in particular a software product sufficiently mature to be put into the hands of its users.
- Test the product deliverable in a customer environment.
- Fine tune the product based upon customer feedback
- Deliver the final product to the end user
- Finalize end-user support material.
- An update of some of the previous documents, as necessary, the plan being replaced by a “post-mortem” analysis of the performance of the project relative to its original and revised success criteria;
- A brief inventory of the organization’s new assets as a result of this cycle.
Wednesday 20 April 2022
ER [Entity–relationship] Model
The ER model is a popular high-level (conceptual) data model. It is an approach to designing the semantic conceptual schema of a database. The ER model allows us to describe the data involved in a real-world environment in terms of objects and their relationships, and is widely used in database design. It provides a preliminary idea of the data representation, which is later refined to arrive at the final detailed design.
Important concepts/notions used in ER modeling are-
Entity
It is an object in real-world or some idea or concept which can be distinguished from other objects.
Ex.: person, school, class, department, weather, salary, temperature etc.
Entity has independent existence.
Entity type
Each entity belongs to an Entity type that defines the structure.
Entity Set
It is a Collection of similar objects.
Attribute
Reflects a property of an object or entity. We have the following types of attributes:
> Simple attribute
> Composite attribute
> Single valued attribute
> Multi-valued attribute
> Derived attribute
> Stored attribute
Key
A key is an attribute of an entity type whose value can uniquely identify an entity in an entity set.
Relationship
The association between entities is known as relationship.
Domain of an attribute
The set of possible values of an attribute is known as its domain.
The Agile Principles
1. Satisfy the customer through early and continuous delivery of valuable software.
2. Welcome changing requirements, even late in development.
3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter time scale.
4. Business people and developers must work together daily throughout the project.
5. Build projects around motivated individuals.
6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
7. Working software is the primary measure of progress.
8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
9. Continuous attention to technical excellence and good design enhances agility.
10. Simplicity – the art of maximizing the amount of work NOT done – is essential.
11. The best architectures, requirements, and designs emerge from self-organizing teams.
12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
Data Model
Data Model is a collection of concepts that can be used to describe the structure of the database. Structure means data types, relationships, constraints etc. DBMS allows a user to define the data to be stored in terms of a data model.
1. High-level data models
2. Low-level data models
3. Representational or Implementation data models
High-level Data Models
Use a set of concepts to describe the database, where the descriptions are close to user views. High-level data models are also known as conceptual models. In conceptual data modeling we use concepts like entity, attributes, relationship, etc.
Low-level Data Models
Give details about how the data is stored in the computer (storage-level details).
Representational/Implementation Data Models
This is in between high-level and low-level data models. Here we represent the concepts described in the conceptual model using specific structures like networks, objects, tables, trees, etc. Ex.: Relational model, Network model, Hierarchical model, Object model, Object-relational model, etc.
Relational Model
The central data description construct in this model is a relation, which can be thought of as a set of records.
Schema
Description of data in terms of a data model is called a schema. A relation schema specifies the name of the relation, fields, type etc.
e.g.: Student(sid: string, name: string, age: integer). Every row follows the schema of the relation.
The following are some important representational data models (DBMS Specific)
1. Network Model
The basic structure is a record; the relationships are captured using links. The database can be seen as an arbitrary network of records connected by links. Ex.: GE’s Integrated Data Store (IDS), early 1960s.
2. Hierarchical Model
The records containing data are organized as a collection of trees. Ex.: IBM’s IMS (Information Management System), late 1960s.
3. Relational Model (early 1970s)
Data & relationships are captured as tables & keys. The basic storage structure is a record (row). Ex.: Oracle, IBM’s DB2, MySQL, Informix, Sybase, MS Access, Ingres, etc.
4. Object Data Model
Objects created through object–oriented programs can be stored in database. Ex.: Object Store
5. Object Relational Model
Objects can be stored in tables. Ex.: Oracle, Informix.
Database Schema
The description of a database is called a database schema.
Three-Schema Architecture:
A database can be described using three different levels of abstractions. Description at each level can be defined by a schema. For each abstraction we focus on one of the specific issues such as user views, concepts, storage etc.
1. External schema: Used to describe the database at external level. Also described in terms of the data model of that DBMS. This allows data access to be customized at the level of individual users/groups/applications. Any external schema has one or more views and relations from the conceptual schema. This schema design is guided by end user requirements.
2. Conceptual schema (logical schema): Describes the stored data in terms of the data model specific to that DBMS. In an RDBMS, the conceptual schema describes all relations that are stored in the database. Arriving at a good choice of relations, fields and constraints is known as conceptual database design.
3. Physical schema: Describes the physical storage strategy for the database.
Tuesday 19 April 2022
Fatal error: Cannot redeclare function() in PHP
Solution
1. Don't declare a function inside a loop; declare it before the loop.
2. Include the file (wherein that function exists) only once. So instead of
include("function.php");
use
include_once("function.php");
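As a hedged sketch (myFunction and function.php are placeholder names), include_once makes a repeated include a no-op; alternatively, the definition itself can be guarded with function_exists:

```php
<?php
// function.php defines myFunction(); pulling it in twice with plain
// include() would trigger "Fatal error: Cannot redeclare myFunction()".
include_once("function.php"); // first include: file is loaded
include_once("function.php"); // second include: silently skipped

// Alternative guard, placed around the definition inside function.php:
if (!function_exists('myFunction')) {
    function myFunction() {
        return "hello";
    }
}
?>
```

The function_exists guard is useful when you cannot control how many times the defining file gets included.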
Monday 18 April 2022
Integrate TinyMCE Editor in PHP
1. Download the latest version of TinyMCE SDK
https://www.tiny.cloud/get-tiny/
2. Extract the folder [tinymce] and put in your application root folder.
eg: C:\xampp\htdocs\tinymcedemo\tinymce [tinymcedemo is the root folder]
3. Create a php file "test.php" and place the below code
<html>
<head>
<script src="tinymce/tinymce.min.js"></script>
<script type="text/javascript">
tinymce.init({
selector: '#mytextarea'
});
</script>
</head>
<body>
<h1>TinyMCE </h1>
<form method="post">
<textarea id="mytextarea"></textarea>
</form>
</body>
</html>
How to disable phpinfo() in a hosting environment?
Login to server WHM as a root user
Edit the php.ini file and add the below line, which disables the phpinfo() function:
disable_functions = phpinfo
Friday 15 April 2022
Fatal error: Cannot pass parameter by reference in PHP
The error is with 'test' in the bind_param call. All parameters to bind_param must be passed by reference. A string literal is a primitive value and cannot be passed by reference.
You can fix this by creating a variable and passing that as a parameter instead:
$testvar = "test";
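Putting it together, a minimal hedged sketch (the connection details and the users table are assumptions):

```php
<?php
// Assumed connection details and a `users` table with a VARCHAR `name` column.
$mysqli = new mysqli("localhost", "dbuser", "dbpass", "testdb");

$stmt = $mysqli->prepare("SELECT id FROM users WHERE name = ?");

// $stmt->bind_param("s", "test");  // Fatal error: a string literal
//                                  // cannot be passed by reference.

$testvar = "test";                  // create a variable instead
$stmt->bind_param("s", $testvar);   // variables can be passed by reference
$stmt->execute();
?>
```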
Enable or disable the "Powered by TinyMCE" branding.
The TinyMCE branding property allows you to enable or disable the "Powered by TinyMCE" branding.
tinymce.init({
selector: 'textarea',
branding: false
});
Thursday 14 November 2019
Adding Excel export button hides/removes the Page Length Dropdown in Data Tables
Use the below "dom" option, which keeps the length menu (l) in the bottom row while placing the buttons (B) and filter (f) in the top row:
"dom": '<"top"Bf>rt<"bottom"lip><"clear">'
Complete Code
<script>
$(document).ready(function () {
$('#example').DataTable( {
"processing": true,
"lengthMenu": [[25, 50, -1], [ 25, 50, "All"]],
"dom": '<"top"Bf>rt<"bottom"lip><"clear">',
"serverSide": true,
"ajax": "server_processing.php"
} );
});
</script>
Friday 27 May 2016
Cloud Computing
3-4-5 Rule
3 : Service Models
1. IaaS (Infrastructure as a Service)
2. PaaS (Platform as a Service)
3. SaaS (Software as a Service)
4 : Deployment Models - Public, Private, Community, and Hybrid cloud
5 : Essential Characteristics - On-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service
Cloud Service Models
Utility Computing
- Utility computing is the packaging of computing resources, such as computation, storage and applications, as a metered service similar to a traditional public utility (such as electricity, water, natural gas, or the telephone network).
- This model has the advantage of a low or no initial cost to acquire computer resources; instead, computational resources are essentially rented.
- You get connected to the utility companies’ “public” infrastructure
- You get these utility services on‐demand
- And you pay‐as‐you use (metered service)
- Cloud computing is the most recent technology innovation which has made utility computing a reality!
Distributed Computing
- A distributed computing system is basically a collection of processors interconnected by a communication network in which each processor has its own local memory and other peripherals, and the communication between any two processors of the system takes place by message passing over the communication network
- Loosely coupled systems
- Examples: Cluster, Grid, P2P, Cloud computing, IOT
- Uses middleware, which enables computers to coordinate their activities and to share the resources of the system.
- Single integrated computing facility from user perspective
- Can include heterogeneous computations where some nodes may perform a lot more computation, some perform very little computation and a few others may perform specialized functionality (like processing visual graphics)
- Efficient, scalable programs can be designed so that independent processes are scheduled on different nodes and communicate only occasionally to exchange results
- Cloud computing is also a specialized form of distributed computing, where distributed SaaS applications utilize thin clients (such as browsers) which offload computation to cloud-hosted servers (and services).
- Additionally, cloud computing vendors providing IaaS and PaaS solutions may internally use distributed computing to provide highly scalable, cost-effective infrastructure and platforms.
Cluster Computing
- A computer cluster is a group of loosely or tightly coupled computers that work together closely so that in many respects it can be viewed as though it were a single computer
- Better performance and availability, and better cost-effectiveness, than a single computer with the same capabilities
- Loosely / tightly coupled computers
- Centralized Job management & scheduling
- Coined in 1987
Grid Computing:
- Grid is a collection of a large number of loosely coupled, heterogeneous, and geographically dispersed systems in different administrative domains
- Generally owned by multiple organizations that are coordinated to allow them to solve a common problem
- Loosely coupled computers
- Distributed Job management & scheduling
- Originated (early 1990s)
- The key emphasis of grid computing was to enable sharing of computing resources or forming a pool of shared resources that can then be delivered to users.
- Focus of grid computing was limited to enabling shared use of resources with common protocols for access
- A particular emphasis was given to handling heterogeneous infrastructure, typically a university data center.
- "Coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations." - Ian Foster & Steve Tuecke, "The Anatomy of the Grid"
- There are also very specific differences between a grid computing infrastructure and the features one should expect from a cloud computing infrastructure
- What is a Grid? (a three-point checklist)
- Coordinates resources that are not subject to centralized control
- Using standard, open, general-purpose protocols and interfaces
- To deliver nontrivial quality of service
Advantages -Distributed computing
- Several applications are inherently distributed in nature and require distributed computing system for their realization
- In a distributed computing system, information generated by one of the users can be easily and efficiently shared by the users working at other nodes of the system.
- Sharing of software resources such as software libraries and databases, as well as hardware resources such as printers, hard disks, etc., can also be done very effectively among all the computers and users of a single distributed computing system.
- It is possible to gradually extend the power and functionality of a distributed computing system by simply adding additional resources (both hardware and software) to the system as and when the need arises.
- Incremental growth is a very attractive feature because for most existing and proposed applications it is practically impossible to predict future demands of the system.
- Addition of new resources to an existing system can be performed without significant disruption of the normal functioning of the system.
- The multiple processors of the distributed computing system can be utilized properly for providing shorter response times and higher throughput than a single processor centralized system.
- Another method often used in distributed computing systems for achieving better overall performance is to distribute the load more evenly among the multiple processors by moving the jobs from currently overloaded processors to lightly loaded ones
- Reliability refers to the degree of tolerance against errors and component failures in a system.
- A reliable system prevents loss of information even in the event of component failures
- An important aspect of reliability is availability, which refers to the fraction of time for which a system is available for use.
- A distributed computing system may have a pool of different types of computers, in which case the most appropriate one can be selected for processing a user’s job depending on the nature of the job.
- Better Price-Performance Ratio:
- With the rapidly increasing power and falling price of microprocessors, combined with the increasing speed of communication networks, distributed computing systems potentially have a much better price-performance ratio than a single large centralized system.
Ubiquitous/ Pervasive computing
- It is an advanced computing concept where computing is made to appear everywhere and anywhere.
- Ubiquitous computing can occur using any device, in any location, and in any format. A user interacts with the computer, which can exist in many different forms: laptops, tablets, terminals, phones, etc.
- Move beyond desktop machine
- Ex: digital audio players, radio-frequency identification tags, PDAs, smartphones, GPS, and interactive whiteboards
- Present or noticeable in every part of a thing or place
- Information processing embedded in everyday activities and objects
Monday 5 May 2014
Installing bedtools on CentOS
1. Download the latest release from the below link
https://github.com/arq5x/bedtools2/releases
2. Save it to a local folder (eg: softwares folder)
3. Untar it using below command
# tar xvzf bedtools-2.19.1.tar.gz
4. You can find all the bedtools functions in the bedtools-2.19.1/bin folder
5. You have to set the PATH environment variable for bedtools so that the OS can locate the bedtools programs, even if they are not in the current directory
Setting the PATH environment variable for bedtools
1. Open /etc/profile.d
2. Create a document and name it as bedtools.sh
3. Add the below line in bedtools.sh
export PATH=$PATH:/softwares/bedtools-2.19.1/bin
4. Save it and close bedtools.sh
5. If you want to load the environment variables within bedtools.sh without having to restart the machine, you can use the source command as in
# source bedtools.sh
Install SRA toolkit on CentOS
1. Download the SRA Toolkit from the below link
http://eutils.ncbi.nih.gov/Traces/sra/?view=software
2. For CentOS 64-bit, we have to download sratoolkit.2.3.5-2-centos_linux64.tar.gz
3. Save it to a local folder (eg: softwares folder)
4. Untar it using below command
# tar xvzf sratoolkit.2.3.5-2-centos_linux64.tar.gz
5. You can find all the toolkit functions in the sratoolkit.2.3.5-2-centos_linux64/bin folder
6. You have to set the PATH environment variable for the toolkit so that the OS can locate the fastq-dump program, even if it is not in the current directory
Setting the PATH environment variable for SRA Toolkit
1. Open /etc/profile.d
2. Create a document and name it as fastq-dump.sh
3. Add the below line in fastq-dump.sh
export PATH=$PATH:/softwares/sratoolkit.2.3.5-2-centos_linux64/bin
4. Save it and close fastq-dump.sh
5. If you want to load the environment variables within fastq-dump.sh without having to restart the machine, you can use the source command as in
# source fastq-dump.sh
Thursday 24 April 2014
Add image/logo in browser tab for website
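The usual approach is a favicon declared in the page's head; a minimal sketch (the filename favicon.ico and its location in the site root are assumptions):

```html
<head>
  <!-- favicon.ico is assumed to sit in the site root -->
  <link rel="icon" type="image/x-icon" href="favicon.ico" />
</head>
```

Most browsers also look for /favicon.ico in the site root automatically, even without the link tag.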
Monday 24 February 2014
Installing XAMPP 1.8.3 on CentOS 6.4
1. Download "xampp-linux-x64-1.8.3-3-installer.run" from the Apache Friends website
2. Copy "xampp-linux-x64-1.8.3-3-installer.run" into the Computer/Filesystem/opt folder
3. Open terminal from opt folder
4. Type below commands
[root@localhost opt]# chmod +x xampp-linux-x64-1.8.3-3-installer.run
[root@localhost opt]# ./xampp-linux-x64-1.8.3-3-installer.run
5. The installation wizard window will appear
6. Complete the installation process