Performance Management and Appraisal Guide
Performance management in digital environments is the structured approach to tracking, analyzing, and improving employee output using technology-driven tools and data. It connects individual contributions to organizational goals through software platforms that automate feedback, measure progress, and inform decision-making. For organizations using management information systems, this process relies on real-time data collection, performance analytics, and integrated workflows to align teams and optimize productivity.
This guide explains how modern performance management operates within digital frameworks and why it’s critical for businesses leveraging online systems. You’ll learn to design appraisal processes that work with existing MIS infrastructure, use data to identify skill gaps, and create feedback cycles that adapt to remote or hybrid work models. Key topics include setting measurable objectives in digital workflows, automating performance tracking without compromising accuracy, and translating appraisal results into actionable training plans.
For students focused on management information systems, this knowledge directly applies to configuring platforms that handle performance data, ensuring compliance with privacy standards, and minimizing manual oversight. Effective performance management reduces operational inefficiencies, supports talent retention, and provides the metrics needed to justify resource allocation. Whether you’re implementing a new system or refining existing processes, these strategies help organizations maintain agility in competitive markets. The methods covered here prepare you to bridge human resource goals with technical capabilities, ensuring systems drive both employee growth and business results.
Foundations of Performance Management in Digital Systems
Performance management in online systems requires methods adapted to virtual workflows and distributed teams. You need systems that measure outcomes objectively while supporting remote collaboration. This section explains how to structure performance management processes using digital tools without relying on physical proximity or informal observation.
Defining Performance Management in Virtual Work Environments
Performance management in digital systems means tracking and improving work output through structured processes executed via online platforms. Unlike traditional office settings, you can’t observe employees directly or rely on casual check-ins. Instead, you use:
- Digital interaction records (messages, task updates, file edits)
- Automated progress tracking in project management tools
- Standardized metrics aligned with organizational objectives
Three characteristics define effective virtual performance management:
- Platform integration: All performance data aggregates in a central system
- Asynchronous coordination: Processes work across time zones without real-time meetings
- Objective benchmarks: Clear numeric targets replace subjective observations
You implement this through tools like OKR software, time-tracking dashboards, and behavior analytics in collaboration platforms. The system must automatically capture relevant data points while minimizing manual input from employees.
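As a minimal illustration of that central aggregation, the sketch below normalizes events from different tools into one per-employee summary. The event shape and tool names are illustrative assumptions, not any vendor's actual schema:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class PerformanceEvent:
    """One normalized data point captured from any connected tool."""
    employee_id: str
    source: str   # e.g., "jira", "slack", "zendesk"
    metric: str   # e.g., "tasks_completed", "tickets_resolved"
    value: float

def aggregate(events):
    """Roll raw events up into a central per-employee metric summary."""
    summary = defaultdict(lambda: defaultdict(float))
    for e in events:
        summary[e.employee_id][e.metric] += e.value
    return {emp: dict(metrics) for emp, metrics in summary.items()}

# Events like these would be captured automatically via integrations
events = [
    PerformanceEvent("emp-101", "jira", "tasks_completed", 4),
    PerformanceEvent("emp-101", "zendesk", "tickets_resolved", 9),
    PerformanceEvent("emp-102", "jira", "tasks_completed", 6),
]
print(aggregate(events))
# {'emp-101': {'tasks_completed': 4.0, 'tickets_resolved': 9.0}, 'emp-102': {'tasks_completed': 6.0}}
```

The key design point is normalization: every tool's output is reduced to the same record shape before it reaches the central store.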
Key Components: Goal Setting, Feedback Loops, Data Collection
Goal Setting
Digital systems require quantifiable, platform-native goals that integrate directly with workflow tools. You set targets using:
SMART criteria adjusted for digital visibility:
- Specific: "Reduce form submission errors in CRM" vs. "Improve data quality"
- Measurable: Trackable through system metrics like error rates
- Achievable: Based on historical platform performance data
- Relevant: Aligned with system upgrade cycles
- Time-bound: Tied to software release schedules
Tool-specific targets:
- "Maintain 95% uptime on customer portal"
- "Resolve 80% of support tickets within SLA windows"
Use goal-setting features in platforms like Asana or Jira to automatically update progress as team members complete tasks.
Feedback Loops
Real-time feedback replaces annual reviews in digital performance management. Build these elements:
- Automated alerts when metrics deviate from targets
- Peer recognition systems embedded in collaboration tools
- Weekly check-ins via recorded video updates or written summaries
Prioritize actionable feedback tied to specific system events:
- "Last week’s server load spike caused response delays – adjust auto-scaling rules"
- "Tickets resolved via new knowledge base articles increased by 18% – expand documentation"
Use feedback tools that integrate with task management systems. For example, leave comments directly on GitHub pull requests or Figma design files.
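The automated alerts mentioned above reduce to a simple threshold check. A minimal sketch, assuming you already collect the metric and its target (names are illustrative):

```python
def check_deviation(metric_name: str, actual: float, target: float,
                    tolerance: float = 0.10):
    """Return an alert string if `actual` deviates from `target` by more
    than `tolerance` (as a fraction of the target); otherwise None."""
    if target == 0:
        return None  # no meaningful target set
    deviation = (actual - target) / target
    if abs(deviation) > tolerance:
        return (f"ALERT: {metric_name} is {deviation:+.0%} vs. target "
                f"({actual} vs. {target})")
    return None

# Example: response-time target of 200 ms, observed 290 ms
message = check_deviation("avg_response_ms", actual=290, target=200)
if message:
    print(message)  # in practice, route to Slack, email, or a ticket queue
```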
Data Collection
Digital systems enable continuous performance monitoring through:
Passive data harvesting:
- Time spent active in work apps
- Output volume (lines of code, resolved tickets)
- System reliability metrics (downtime, error rates)
Active input channels:
- Self-reported task completion in Trello
- Skill assessments in LMS platforms
- Peer reviews via 360-degree feedback tools
Balance these data types to avoid overload:
- Quantitative: Use dashboards showing real-time metrics like Zendesk ticket stats
- Qualitative: Schedule monthly text-based retrospectives in Slack
- Comparative: Benchmark team performance against historical data in Tableau
Configure data collection to respect privacy while maintaining transparency. Clearly communicate what metrics you track and how they impact evaluations.
Implement these components using APIs that connect your HRIS, project management tools, and analytics platforms. For example, sync Salesforce KPIs with BambooHR performance records to automate reporting. Test different combinations of tools to find workflows that generate useful insights without creating administrative burdens.
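A hedged sketch of that kind of sync is below. The endpoints and field names are hypothetical placeholders, not the actual Salesforce or BambooHR APIs; a real integration would use each vendor's documented REST endpoints and authentication:

```python
import requests

# Hypothetical endpoints -- replace with the vendors' documented APIs
SOURCE_URL = "https://crm.example.com/api/kpis"
TARGET_URL = "https://hris.example.com/api/performance-records"

def sync_kpis(token: str) -> None:
    """Pull KPI values from a CRM-like source and push them into an
    HRIS-like performance record (illustrative shapes only)."""
    headers = {"Authorization": f"Bearer {token}"}
    kpis = requests.get(SOURCE_URL, headers=headers, timeout=10).json()
    for kpi in kpis:
        record = {
            "employee_id": kpi["owner_id"],
            "metric": kpi["name"],    # e.g., "quota_attainment"
            "value": kpi["value"],
            "period": kpi["period"],  # e.g., "2024-Q2"
        }
        resp = requests.post(TARGET_URL, json=record,
                             headers=headers, timeout=10)
        resp.raise_for_status()  # surface sync failures for auditing
```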
Adjust your approach as teams adopt new technologies. If you implement a new chatbot platform, update goal-setting templates to include conversation resolution rates. When deploying process automation tools, add metrics for reduced manual intervention. Keep performance management systems as dynamic as the digital environments they monitor.
Aligning Performance Metrics with Business Objectives
Effective performance management in Online Management Information Systems requires aligning individual contributions with organizational outcomes. When metrics directly support business goals, you create a transparent framework that drives accountability, improves decision-making, and ensures resources focus on high-impact activities.
Setting SMART Goals for Remote Teams
Remote work environments demand clear, measurable objectives to maintain alignment across distributed teams. SMART goals—Specific, Measurable, Achievable, Relevant, and Time-bound—provide structure for defining expectations in roles involving system implementation, data analysis, or user support.
- Specific: Define outcomes tied to system performance or user needs. Example: "Reduce database query response time by 15% within Q3" instead of "Improve system speed."
- Measurable: Use data points like error rates, uptime percentages, or task completion times. For instance, track system downtime in your monitoring dashboard.
- Achievable: Balance ambition with technical constraints. If upgrading a legacy system, allocate realistic timelines for testing phases.
- Relevant: Connect goals to organizational priorities like cybersecurity or user experience. A developer might prioritize resolving authentication vulnerabilities over cosmetic interface updates.
- Time-bound: Set fixed deadlines for deliverables like API integrations or report generation cycles.
For remote teams, use collaborative platforms to document goals in shared workspaces. Schedule biweekly progress reviews to adjust targets based on emerging technical challenges or shifting business needs.
Linking KPIs to Strategic Priorities
Key Performance Indicators (KPIs) must reflect how daily tasks contribute to broader organizational strategies. Start by mapping department-level objectives to individual roles:
- If the company prioritizes data-driven decision-making, a data analyst’s KPIs could include:
  - 95% accuracy in monthly forecasting models
  - Reduction of data processing time by 20%
- For teams focused on system security, KPIs might track:
  - Number of unresolved critical vulnerabilities per sprint
  - Average response time to breach alerts
Use a cascading model:
1. Identify top-level business objectives (e.g., "Expand cloud infrastructure capacity by 40%").
2. Break these into departmental targets (IT: "Migrate 50 legacy databases to cloud servers").
3. Define individual KPIs (Database Administrator: "Complete 10 secure migrations per month").
Update KPIs quarterly to reflect changes in organizational strategy, such as shifts from system maintenance to innovation-driven projects.
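The cascading model maps naturally onto a nested data structure. A sketch using the example targets from this section:

```python
cascade = {
    "business_objective": "Expand cloud infrastructure capacity by 40%",
    "department_targets": [
        {
            "department": "IT",
            "target": "Migrate 50 legacy databases to cloud servers",
            "individual_kpis": [
                {
                    "role": "Database Administrator",
                    "kpi": "Complete 10 secure migrations per month",
                },
            ],
        },
    ],
}

# Walk the cascade to show each person how their KPI rolls up
for dept in cascade["department_targets"]:
    for item in dept["individual_kpis"]:
        print(f'{item["role"]}: "{item["kpi"]}" supports the '
              f'{dept["department"]} target "{dept["target"]}", which '
              f'supports "{cascade["business_objective"]}"')
```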
Balancing Quantitative and Qualitative Metrics
While quantitative metrics dominate technical roles, qualitative assessments capture critical factors like problem-solving agility and collaboration.
Quantitative Examples:
- System uptime percentages
- Tickets resolved per support agent
- Code deployment frequency
Qualitative Examples:
- Peer feedback on code readability
- User satisfaction scores from post-implementation surveys
- Manager evaluations of crisis management during system outages
To measure qualitative factors objectively:
- Use standardized rubrics for skills like communication or innovation.
- Conduct 360-degree reviews involving project managers and cross-functional peers.
- Analyze user feedback for recurring themes about system usability or support interactions.
For roles like MIS directors, weight metrics to reflect role requirements:
- Technical leads: 70% quantitative (project delivery speed), 30% qualitative (mentorship impact).
- User experience designers: 50% quantitative (task completion rates), 50% qualitative (user testing insights).
Integrate both metric types into performance dashboards. For example, pair average response time (quantitative) with annotations explaining recurring technical bottlenecks (qualitative context).
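The role-based weighting above is a simple weighted average. A minimal sketch, with illustrative scores normalized to a 0-100 scale:

```python
def composite_score(quantitative: float, qualitative: float,
                    quant_weight: float = 0.7) -> float:
    """Blend normalized (0-100) quantitative and qualitative scores
    using a role-specific weight."""
    qual_weight = 1.0 - quant_weight
    return quantitative * quant_weight + qualitative * qual_weight

# Technical lead: 70% delivery metrics, 30% mentorship impact
print(composite_score(quantitative=88, qualitative=75))                    # 84.1
# UX designer: 50/50 split between task completion and testing insights
print(composite_score(quantitative=82, qualitative=90, quant_weight=0.5)) # 86.0
```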
Adjust your balance annually based on role evolution. A network administrator might initially focus on uptime metrics but later prioritize qualitative leadership metrics when managing junior staff.
Effective Appraisal Methods for Distributed Teams
Evaluating performance in distributed teams requires methods that account for remote collaboration, asynchronous work, and digital communication. Traditional annual reviews often fail to capture the dynamics of online environments. These three approaches address visibility gaps, align remote contributions with organizational goals, and maintain engagement across distances.
360-Degree Feedback in Virtual Settings
360-degree feedback gathers input from multiple sources: peers, subordinates, managers, and sometimes clients. In virtual teams, this method counteracts the lack of in-person observation by creating a comprehensive performance picture.
Use anonymous digital surveys to collect candid feedback across time zones. Platforms like Lattice or Culture Amp automate distribution and aggregate results. Focus questions on specific behaviors:
- How effectively does this person communicate in virtual meetings?
- Do they meet deadlines for cross-functional deliverables?
- How do they support remote teammates during challenges?
Set clear evaluation criteria tied to remote work competencies:
- Responsiveness in digital channels
- Quality of written communication
- Proactive collaboration in shared documents
Managers review feedback trends rather than isolated comments. Share summarized results with employees, highlighting strengths and growth areas. This method reduces bias from single-manager assessments and surfaces patterns invisible in distributed work.
OKR-Based Performance Reviews
Objectives and Key Results (OKRs) create objective benchmarks for remote employees. They align individual contributions with organizational priorities, making impact measurable across locations.
Define quarterly OKRs with two components:
- Objective: Qualitative goal (e.g., "Improve system uptime for cloud-based MIS")
- Key Results: 3-5 quantifiable metrics (e.g., "Reduce server downtime by 40%", "Resolve 90% of priority tickets within 2 hours")
During reviews, evaluate OKR completion rates and problem-solving approaches. Ask:
- Which key results were achieved or missed?
- What remote collaboration tools were used to progress objectives?
- How did the employee adapt strategies when facing time zone barriers?
Track OKRs in shared platforms like Asana or Monday.com. Employees update progress weekly, creating a transparent record for appraisal discussions. This method shifts focus from hours worked to outcomes delivered—critical for teams using flexible schedules.
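Completion rates for key results can be computed directly from tracked start, target, and actual values. A sketch using the uptime example above (field values are illustrative):

```python
def kr_progress(actual: float, start: float, target: float) -> float:
    """Fraction of the way from the starting value to the target,
    capped to the 0-1 range (works for both increase and decrease goals)."""
    if target == start:
        return 1.0
    return max(0.0, min(1.0, (actual - start) / (target - start)))

# Objective: "Improve system uptime for cloud-based MIS"
key_results = [
    # (description, start, target, actual)
    ("Monthly server downtime (hours)", 10.0, 6.0, 7.0),       # reduce by 40%
    ("Priority tickets resolved within 2h (%)", 70.0, 90.0, 85.0),
]

scores = [kr_progress(actual, start, target)
          for _, start, target, actual in key_results]
print(f"OKR completion: {sum(scores) / len(scores):.0%}")  # 75%
```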
Continuous Feedback Systems
Annual reviews struggle to keep pace with online work. Continuous feedback systems provide real-time insights through frequent, structured check-ins, and teams using this approach often report lower turnover.
Implement these components:
- Weekly peer recognition: A Slack channel where teammates share kudos for specific contributions
- Biweekly manager check-ins: 15-minute video calls discussing current projects and blockers
- Monthly skill development notes: Managers document observed improvements in key competencies
Use feedback tools that integrate with your existing tech stack:
- Microsoft Viva Goals for tracking professional development
- Bonusly for peer-to-peer recognition tied to company values
- 15Five for pulse surveys on workload and morale
Document all feedback in a central platform. During formal appraisals, reference this history to show progression or recurring issues. Continuous data prevents "recency bias" and gives remote employees regular opportunities to adjust their work habits.
Focus on observable behaviors, not subjective traits. Instead of "Be more communicative," say "Share daily status updates in the project channel by 10 AM local time." This specificity helps remote workers understand expectations without in-person oversight.
Train managers to deliver feedback effectively in virtual settings:
- Use screen sharing to discuss performance metrics visually
- Record video summaries of appraisal meetings for later reference
- Follow up written feedback with a live discussion to prevent misinterpretation
Prioritize feedback that improves system interactions. For example, note how an employee’s documentation practices in Confluence affect other team members’ ability to complete tasks. This reinforces how individual performance impacts the broader MIS infrastructure.
Technology Solutions for Performance Tracking
Modern performance management relies on digital tools to track, analyze, and optimize employee output. Cloud-based platforms automate data workflows, reduce administrative overhead, and provide real-time insights. This section breaks down the key tools and strategies for managing performance data effectively within online systems.
Top Platforms: BambooHR, Lattice, Workday
Cloud-based performance management systems are now the standard for most organizations. These platforms centralize employee data, simplify goal alignment, and streamline appraisal processes.
BambooHR focuses on user-friendly performance tracking for small to mid-sized businesses. Key features include:
- Customizable review templates for 360-degree feedback
- Goal-setting modules with progress visualization
- Integration with payroll and time-tracking systems
Lattice targets organizations prioritizing continuous feedback. Its standout tools include:
- Real-time engagement pulse surveys
- Career development planning with skill gap analysis
- OKR (Objectives and Key Results) tracking linked to individual KPIs
Workday serves enterprises requiring deep integration with ERP systems. Its performance tools offer:
- AI-driven talent analytics for succession planning
- Workforce capacity forecasting based on performance trends
- Compliance reporting for regulated industries
All three platforms support mobile access and provide role-based dashboards for employees, managers, and HR teams.
Automating Data Collection and Reporting
Manual data entry creates bottlenecks and increases error risks. Automation tools solve this by:
- Syncing data from multiple sources: Connect time-tracking software (Toggl, Harvest), project management tools (Asana, Jira), and communication platforms (Slack, Microsoft Teams) via APIs.
- Triggering feedback cycles: Set automatic reminders for check-ins or review submissions based on project milestones or calendar dates.
- Generating real-time reports: Use prebuilt templates to visualize team performance metrics like goal completion rates or competency assessments.
For example, a sales team’s CRM data can automatically populate performance dashboards, showing individual quotas versus actual sales. This eliminates manual data aggregation and ensures metrics stay current.
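That aggregation step is straightforward once CRM records are pulled into memory. A minimal sketch with an illustrative record shape:

```python
from collections import defaultdict

# Illustrative rows as they might arrive from a CRM export or API
deals = [
    {"rep": "Ana", "amount": 12000},
    {"rep": "Ana", "amount": 8000},
    {"rep": "Luis", "amount": 15000},
]
quotas = {"Ana": 25000, "Luis": 12000}

totals = defaultdict(float)
for deal in deals:
    totals[deal["rep"]] += deal["amount"]

# Dashboard rows: quota vs. actual, computed without manual entry
for rep, quota in quotas.items():
    attainment = totals[rep] / quota
    print(f"{rep}: {totals[rep]:,.0f} / {quota:,.0f} = {attainment:.0%}")
# Ana: 20,000 / 25,000 = 80%   Luis: 15,000 / 12,000 = 125%
```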
Critical automation features to prioritize:
- Customizable alert thresholds (e.g., flagging employees below productivity benchmarks)
- Exportable reports in formats compatible with BI tools like Tableau or Power BI
- Audit trails showing when data was added or modified
Security Considerations for Performance Data
Performance records contain sensitive information like salary histories, competency assessments, and disciplinary notes. Protecting this data requires:
Encryption standards:
- End-to-end encryption for data in transit (e.g., during peer reviews)
- AES-256 encryption for stored data, including archived reviews
Access controls:
- Role-based permissions (e.g., restrict view-only access for junior managers)
- Multi-factor authentication (MFA) for all user accounts
- IP address restrictions for system administrators
Compliance protocols:
- Anonymization tools for feedback used in aggregate analytics
- Automated retention policies to delete outdated records
- Audit logs tracking who accessed or modified employee files
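To make the AES-256 requirement concrete, here is a minimal sketch using the open-source cryptography package. It illustrates the cipher itself, not any specific platform's implementation; production systems also need key management and rotation:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key; store in a key vault
aead = AESGCM(key)

record = b'{"employee_id": "emp-101", "rating": 4, "notes": "..."}'
nonce = os.urandom(12)  # must be unique per encryption; store with ciphertext

ciphertext = aead.encrypt(nonce, record, None)   # None = no associated data
plaintext = aead.decrypt(nonce, ciphertext, None)
assert plaintext == record  # round-trip works only with the correct key
```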
Always verify that your vendor undergoes third-party security audits and holds certifications like ISO 27001 or SOC 2 Type II. Train managers to recognize phishing attempts targeting performance data, and establish clear protocols for reporting suspected breaches.
When evaluating platforms, prioritize those offering granular consent management. For example, employees should be able to control who sees their development goals or self-assessment notes. This balances transparency with privacy in distributed work environments.
Integrate performance systems with your organization’s Identity and Access Management (IAM) framework to maintain consistent security policies across all HR technologies. Regular penetration testing and simulated attack drills further harden systems against evolving threats.
Implementing Online Performance Management Systems
This section outlines concrete steps to deploy digital appraisal systems effectively. You’ll learn how to execute a structured implementation plan, manage organizational change, and evaluate the success of your system.
Five-Step Implementation Process: Needs Assessment to Training
Follow this sequence to minimize disruptions and maximize system adoption:
1. Conduct a needs assessment
- Identify gaps in your current appraisal process (e.g., delayed feedback, inconsistent metrics).
- Define measurable goals for the new system, such as reducing appraisal cycle time by 30% or increasing employee engagement scores.
- Survey employees and managers to prioritize required features like 360-degree feedback or real-time goal tracking.
2. Select a system aligned with business requirements
- Compare platforms based on scalability, integration with existing HR software (e.g., payroll systems), and compliance with data security standards.
- Prioritize user experience: A cluttered interface will reduce adoption.
3. Develop a phased rollout plan
- Start with a pilot group of 50-100 users to identify technical or workflow issues.
- Schedule data migration from legacy systems during off-peak hours to avoid downtime.
- Set clear deadlines for each implementation phase, including data validation checkpoints.
4. Test system functionality
- Simulate full appraisal cycles with dummy data to verify automated workflows, notification triggers, and report generation.
- Confirm role-based access controls prevent unauthorized data viewing.
5. Deliver role-specific training
- Create separate training modules for employees (submitting self-assessments), managers (conducting evaluations), and admins (generating analytics).
- Provide quick-reference guides for common tasks like updating objectives or resolving sync errors.
Change Management Strategies for Adoption
Resistance to new systems often stems from unclear benefits or fear of increased workload. Use these tactics to drive acceptance:
- Communicate transparently before launch: Explain how the system simplifies tasks, such as replacing manual form submissions with automated reminders. Address concerns about data privacy upfront.
- Involve stakeholders early: Recruit department leaders and high-performing employees as beta testers. Their feedback refines the system, and their advocacy builds peer trust.
- Address skill gaps proactively: Offer pre-launch workshops for employees uncomfortable with technology. Pair them with digital mentors during the first appraisal cycle.
- Align the system with existing workflows: Integrate the platform with tools employees already use, like email clients or project management software. Avoid creating parallel processes.
- Monitor and adjust post-launch: Track support ticket trends during the first 90 days. If 40% of issues relate to goal-setting features, deploy additional training or simplify the interface.
Measuring Implementation Success Rates
Quantify system effectiveness using these key performance indicators (KPIs):
- User adoption rate: Calculate the percentage of employees completing appraisals through the new system versus legacy methods. Aim for ≥85% adoption within six months.
- Process efficiency gains: Measure time saved per appraisal cycle. For example, if managers previously spent eight hours monthly on evaluations and now spend five, you’ve achieved a 37.5% efficiency gain.
- Data accuracy improvements: Compare error rates in performance data before and after implementation. Automated systems typically reduce manual entry errors by 50-70%.
- Employee engagement impact: Use pulse surveys to assess whether the system provides clearer performance insights. Look for a 10-15% increase in responses agreeing with “I understand how my work contributes to company goals.”
- System ROI: Calculate cost savings from reduced paper usage, IT support hours, or HR labor. Factor in soft benefits like improved retention rates linked to frequent feedback.
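These KPI calculations are simple arithmetic. A sketch using the numbers from the examples above:

```python
def adoption_rate(new_system_appraisals: int, total_appraisals: int) -> float:
    """Share of appraisals completed in the new system vs. all methods."""
    return new_system_appraisals / total_appraisals

def efficiency_gain(hours_before: float, hours_after: float) -> float:
    """Fraction of time saved per appraisal cycle."""
    return (hours_before - hours_after) / hours_before

print(f"Adoption: {adoption_rate(430, 500):.0%}")    # 86% -- above the 85% goal
print(f"Efficiency: {efficiency_gain(8, 5):.1%}")    # 37.5% -- matches the example
```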
Adjust your strategy based on these metrics. If adoption lags in specific departments, conduct targeted training. If data accuracy remains unchanged, audit integration points with other software for sync failures.
Addressing Common Challenges in Digital Appraisals
Remote performance evaluations in online management systems introduce unique challenges that require deliberate strategies to resolve. You’ll need to address algorithmic bias, sustain engagement in virtual settings, and ensure compliance with diverse legal frameworks. Below are actionable solutions for these common issues.
Reducing Bias in Algorithm-Driven Assessments
Algorithm-driven tools streamline evaluations but risk amplifying biases if not properly managed. Bias reduction starts with transparency: clearly document how algorithms weigh performance metrics like task completion rates, communication frequency, or project outcomes.
- Audit algorithms regularly for skewed patterns. For example, if a system penalizes employees working non-standard hours, adjust the model to prioritize output quality over rigid timelines.
- Use diverse training data to prevent reinforcing historical inequities. If past promotions favored one demographic, exclude identifiers like age or gender from datasets used to train predictive models.
- Combine automated scores with human reviews. Assign managers to validate algorithm-generated ratings, focusing on subjective criteria like creativity or collaboration that machines might misinterpret.
- Enable employee feedback loops. Let employees flag perceived biases in their evaluations and use this input to refine assessment models.
Avoid treating algorithmic outputs as infallible. Systems trained on flawed data will replicate those flaws unless actively corrected.
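One basic audit (a heuristic sketch, not a complete fairness methodology) compares average algorithm scores across groups and flags gaps beyond a chosen threshold:

```python
from collections import defaultdict

def audit_score_gap(records, group_key, threshold=5.0):
    """Flag groups whose mean algorithm score differs from the overall
    mean by more than `threshold` points (illustrative heuristic only)."""
    by_group = defaultdict(list)
    for r in records:
        by_group[r[group_key]].append(r["score"])
    overall = sum(r["score"] for r in records) / len(records)
    flags = []
    for group, scores in by_group.items():
        gap = sum(scores) / len(scores) - overall
        if abs(gap) > threshold:
            flags.append(f"{group_key}={group}: mean gap {gap:+.1f} pts")
    return flags

# Illustrative data: scores split by work-schedule pattern
records = [
    {"shift": "standard", "score": 82}, {"shift": "standard", "score": 85},
    {"shift": "non-standard", "score": 71}, {"shift": "non-standard", "score": 74},
]
print(audit_score_gap(records, "shift"))
# ['shift=standard: mean gap +5.5 pts', 'shift=non-standard: mean gap -5.5 pts']
```

A symmetric gap like this would prompt a human review of whether the model penalizes non-standard hours rather than output quality.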
Maintaining Engagement in Virtual Reviews
Virtual appraisals often suffer from low participation or distracted participants. Structure matters: design sessions to minimize fatigue and maximize interaction.
- Predefine agendas and time limits. Share discussion topics, self-assessment forms, and performance metrics at least three days in advance to let employees prepare. Limit meetings to 45 minutes unless complex issues require deeper discussion.
- Use interactive formats like live polls, shared document editing, or screen-sharing to demonstrate specific feedback. For team evaluations, break larger groups into smaller breakout rooms for peer reviews.
- Incorporate real-time recognition. Publicly acknowledge achievements during reviews using team-wide chat channels or virtual “badges” integrated into your management system.
- Train managers to avoid monologues. Use open-ended questions like “How do you think this project aligned with your goals?” to encourage dialogue.
Technical issues can derail engagement. Test platforms for compatibility with assistive technologies, and provide guides for troubleshooting common problems like audio delays or screen-freezing.
Legal Compliance Across Jurisdictions
Data privacy laws and employment regulations vary by region, making cross-border compliance a critical concern. Noncompliance risks fines, lawsuits, or revoked operating licenses.
- Map data flows to identify where employee information is stored, processed, or transferred. If your system uses cloud servers in multiple countries, ensure each location adheres to local laws like the GDPR or CCPA.
- Implement granular consent mechanisms. Allow employees to opt out of specific data collection practices, such as screen monitoring or email sentiment analysis, unless legally required for their role.
- Standardize record retention policies. Automatically delete appraisal records after the mandated period—for example, two years under some U.S. state laws versus five years in parts of the EU.
- Adapt evaluation criteria to local labor laws. In regions prohibiting performance-based termination, avoid metrics that directly tie productivity scores to dismissal decisions.
Regularly update compliance protocols as laws change. For instance, some jurisdictions now restrict the use of AI in hiring or promotions, requiring manual overrides in appraisal systems.
By addressing these challenges proactively, you create a fairer, more engaging, and legally sound appraisal process. Focus on continuous improvement: refine algorithms, experiment with engagement tactics, and monitor regulatory updates to keep your system effective.
Future Trends in Performance Analytics
Emerging technologies are transforming how organizations measure and manage employee performance. These innovations shift evaluation from static annual reviews to dynamic systems that provide actionable insights. Three key developments—AI-driven prediction models, real-time productivity tracking, and blockchain-based record systems—are redefining what’s possible in performance analytics.
AI-Powered Predictive Performance Modeling
AI transforms raw performance data into forecasts that help managers anticipate outcomes rather than just report past results. Machine learning algorithms analyze historical patterns in:
- Task completion rates
- Skill development trajectories
- Team collaboration metrics
- Project success correlations
These systems identify employees at risk of underperformance 6-12 months before issues become apparent. For example, an AI model might flag decreased code deployment frequency in software engineers as a precursor to burnout or disengagement.
Predictive models integrate with existing HR tools like:
- Learning management systems (LMS)
- Project management platforms (Jira, Asana)
- Communication apps (Slack, Microsoft Teams)
You configure thresholds for automated interventions—triggering mentorship programs when skill gaps exceed 15% or adjusting workloads when stress indicators surface.
Real-Time Productivity Dashboards
Live dashboards replace quarterly reports, showing performance metrics updated every 15-60 seconds. These interfaces display:
- Current task progress against deadlines
- Meeting participation analytics (speaking time, agenda contributions)
- System usage rates for critical tools
Managers access role-specific views:
- Developers: Lines of code committed, pull request turnaround
- Sales Teams: Call duration patterns, CRM update frequency
- Customer Support: Ticket resolution times, sentiment analysis of client interactions
Alerts notify you when metrics deviate from baselines. A support agent’s average resolution time spiking by 40% triggers an immediate check-in instead of waiting for monthly reviews.
Dashboards prioritize customizable visualizations:
- Heatmaps showing peak productivity hours
- Scatter plots correlating training hours with output quality
- Gantt charts mapping project phase efficiencies
Blockchain for Immutable Performance Records
Blockchain creates tamper-proof performance histories that solve two critical problems: data integrity and verification speed. Each evaluation entry gets timestamped and cryptographically linked to previous records, preventing retrospective edits without leaving an audit trail.
Key applications include:
- Skill Certification Logs: Permanently stored certifications (cloud architecture, compliance training) visible across departments
- Peer Review Chains: 360-degree feedback stored as sequential blocks, ensuring no post-submission alterations
- Promotion Timelines: Unchangeable records of role changes and associated performance justifications
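The tamper evidence comes from hash chaining: each entry stores a hash of its content plus the previous entry’s hash, so any retroactive edit breaks every later link. A minimal sketch of a plain hash chain (no consensus layer, unlike a full blockchain):

```python
import hashlib
import json
import time

def add_entry(chain, payload):
    """Append a timestamped record linked to the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "timestamp": time.time(), "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every hash; any retroactive edit breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        body = {k: entry[k] for k in ("payload", "timestamp", "prev")}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != prev or recomputed != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

chain = []
add_entry(chain, {"employee": "emp-101", "review": "meets expectations"})
add_entry(chain, {"employee": "emp-101", "certification": "cloud architecture"})
print(verify(chain))                                     # True
chain[0]["payload"]["review"] = "exceeds expectations"   # attempted edit
print(verify(chain))                                     # False -- tampering detected
```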
Smart contracts automate processes when predefined conditions are met:
- Automatic salary increases after completing 100 hours of leadership training
- Bonus payments triggered by exceeding quarterly OKRs by 10%
- Role eligibility unlocks upon mastering three new data analysis tools
Employees control access permissions through decentralized identifiers (DIDs), deciding who views specific performance segments. This balances transparency with privacy—a project manager might share sprint completion rates with stakeholders while keeping health-related productivity dips confidential.
Industries requiring compliance audits (finance, healthcare) use blockchain trails to prove evaluation processes meet regulatory standards. Dispute resolution accelerates when both parties access identical, unmodified historical data.
Implementation Considerations
Transitioning to these systems requires:
- Data hygiene audits to ensure existing performance records are machine-readable
- Cross-department alignment on which metrics reflect true productivity
- Employee training programs explaining how AI/blockchain tools augment—not replace—human judgment
Legacy systems often export data to cloud-based analytics platforms through APIs, while newer organizations build native integrations. Performance benchmarks reset quarterly as models incorporate fresh data, preventing outdated success criteria from skewing results.
The shift from evaluating what employees did to predicting what they can achieve demands updated policies. You’ll revise promotion criteria to include adaptability scores from AI models and incorporate blockchain-verified peer feedback into leadership potential assessments.
Key Takeaways
Here's what you need to remember about performance management systems:
- Align team metrics with organizational goals by creating clear goal hierarchies visible across departments
- Automate data collection using cloud-based tools for live insights, but validate system accuracy before full rollout
- Refresh evaluation criteria quarterly using workforce analytics to match shifting priorities or hybrid work patterns
Next steps: Audit your current appraisal process against these three principles and prioritize one upgrade this month.