Privacy Policy

How NativeScore handles assessment data.

This page explains collection, processing, retention, and rights related to account, assessment, certificate, and proctoring information.

Policy Overview

Comprehensive privacy framework for secure assessment operations

1. Scope and purpose of this privacy policy

This Privacy Policy describes how NativeScore processes information when organizations use the platform for language assessment, candidate readiness testing, campaign-based screening, and certificate verification workflows. The policy applies to public pages, authenticated dashboards, attempt sessions, and post-test review surfaces where personal information or assessment evidence may appear. It is designed so institutions, hiring teams, and candidates can understand what data is handled, why it is handled, and how controls are applied through role-based permissions.

The platform is used in different operating models, including schools, colleges, training partners, recruitment teams, and multilingual service operations. Data handling therefore reflects both platform-level safeguards and campaign-level choices made by customer administrators. When an admin enables features such as secure fullscreen flow, screen share checks, camera capture, microphone recording, or transcript support, relevant technical data is processed to run the requested function. This policy explains those processing pathways in plain language.

If a customer has executed separate contractual terms that include stricter retention windows or region-specific obligations, those terms can apply in addition to this policy. In case of conflict between this page and a signed agreement, the signed agreement governs to the extent permitted by law. For all general website visitors and non-contracted usage, this published policy remains the primary reference.

2. Categories of data we collect and process

NativeScore may process account and identity attributes such as candidate name, assigned identifier, email, phone number, campaign membership, and role information for admin, supervisor, HR, or agent users. We may also process technical metadata required for secure delivery, including browser and device context, timestamped events, session state, and permission signals related to camera, microphone, or screen-share setup where those controls are required by campaign configuration.

During assessment execution, the platform processes section-level responses, objective and subjective answer payloads, test navigation events, timing signals, start and submit milestones, and scoring artifacts generated by workflow rules. For speaking and proctored flows, additional evidence may include transcript payloads, media file references, upload logs, and secure attempt intelligence records that link activity to a campaign and attempt identifier. This information is used to deliver a valid testing process and support authorized review.

For certificate and verification workflows, data may include certificate ID, linked attempt summary, result status, section breakdown metrics, issuance timestamp, and verification lookups. We process this information to provide trust-preserving result validation and to prevent mismatched certificate representations across user roles. Public verification surfaces are intentionally constrained to relevant fields so sensitive internals are not broadly exposed.
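The field constraint on public verification surfaces can be sketched as a simple allow-list filter. This is a minimal illustration, not NativeScore's actual schema or API: the field names (`certificate_id`, `result_status`, and so on) and the internal fields shown are assumptions.

```python
# Hypothetical sketch: a public certificate-verification response is
# built only from an allow-list of safe fields, so internal attributes
# never reach the public surface. Field names are illustrative.

PUBLIC_FIELDS = {"certificate_id", "result_status", "issued_at", "section_summary"}

def public_verification_view(record: dict) -> dict:
    """Return only the fields safe to expose on a public lookup."""
    return {k: v for k, v in record.items() if k in PUBLIC_FIELDS}

record = {
    "certificate_id": "NS-2024-00123",
    "result_status": "PASS",
    "issued_at": "2024-06-01T10:00:00Z",
    "section_summary": {"reading": 82, "speaking": 77},
    "candidate_email": "candidate@example.com",  # internal; must not leak
    "attempt_id": "att_9f3",                     # internal; must not leak
}

print(public_verification_view(record))
```

The key property is that exposure is opt-in per field: anything not explicitly allow-listed stays internal by default.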

3. Legal basis and legitimate platform use

Processing activities are performed for legitimate business purposes such as delivering contracted assessment services, maintaining platform integrity, enforcing campaign security controls, supporting operational accountability, and preserving auditable results for organizations that rely on language evaluation data in admissions or hiring decisions. In many cases, processing is necessary for contractual performance between NativeScore and the customer organization that provisions the campaign.

Where required by applicable law, processing may also rely on consent or equivalent lawful grounds, particularly for device permissions connected to recording features. Campaign owners are responsible for ensuring that candidates are informed of applicable assessment conditions before attempt start. NativeScore provides technical controls and policy disclosures, while customer organizations remain responsible for lawful deployment within their jurisdiction and institutional context.

We do not support unauthorized surveillance. Platform recording and monitoring controls are intended only for declared assessment governance, quality assurance, and integrity review. Misuse outside legitimate exam operations is prohibited by platform terms and may trigger suspension of access or legal action where relevant.

4. How we use data during assessment delivery

Data is processed to initialize attempts, enforce section sequencing, maintain timing controls, and apply campaign-defined constraints such as fullscreen and secure test flow requirements. These controls are essential for fairness and consistency because candidate outcomes should reflect language capability, not inconsistent delivery conditions. Session telemetry allows the platform to detect whether required prerequisites are met before progression.

Response data is used to calculate objective scoring, route subjective answers for review, generate section summaries, and maintain a complete attempt trail for post-test analysis. Where speech-to-text integrations are enabled, audio segments and transcript payloads are processed to provide review support and scoring context. The platform architecture is designed so this information can be accessed only through authorized roles with campaign scope checks.
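The "authorized roles with campaign scope checks" idea can be sketched as a two-part predicate: the user's role must be on an authorized list, and the attempt must belong to a campaign inside the user's scope. The role names and permission shape below are assumptions for illustration, not NativeScore's actual access model.

```python
# Hypothetical sketch of role- and campaign-scoped access control:
# evidence is visible only when BOTH the role check and the campaign
# scope check pass. Role names and data shapes are assumptions.

ALLOWED_ROLES = {"admin", "supervisor", "hr"}

def can_view_attempt(user: dict, attempt: dict) -> bool:
    """Authorize attempt evidence access: authorized role AND the
    attempt's campaign must be within the user's campaign scope."""
    return (
        user["role"] in ALLOWED_ROLES
        and attempt["campaign_id"] in user["campaign_ids"]
    )

reviewer = {"role": "supervisor", "campaign_ids": {"camp_01", "camp_02"}}
attempt = {"campaign_id": "camp_02", "attempt_id": "att_77"}

print(can_view_attempt(reviewer, attempt))  # True: role and scope both match
```

Requiring both conditions means a privileged role alone is not enough: a supervisor for one campaign cannot read evidence from another.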

Post-assessment, data is used for result publication, certificate generation, downloadable records, and verification endpoints. This continuity reduces manual reconciliation and supports organizations that need to validate candidate performance after the test window has closed.

5. Recording data, proctoring evidence, and media handling

If enabled by campaign settings, NativeScore may process screen recordings, camera streams, microphone captures, and related media metadata. These artifacts are used for integrity verification, incident review, and controlled supervisor or admin analysis. Media collection is configuration-driven: if a campaign does not require a particular stream, the platform should not process that stream for the attempt.

Recording persistence follows operational logic associated with complete test submission or configured save checkpoints. Media files may be organized in storage paths that support campaign-level management so customers can retain or purge evidence in an administratively manageable way. Retention behavior can differ based on plan selection and contractual requirements. Access to media references is restricted to authorized roles and should never be treated as a publicly shareable asset.
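Campaign-level storage organization can be sketched as a prefix scheme: every media artifact lives under its campaign's path, so retention or purge operations can target one campaign without touching others. The layout below is an assumption for illustration, not NativeScore's actual storage scheme.

```python
# Hypothetical sketch: campaign-scoped object-storage keys. Placing
# all attempt media under a per-campaign prefix lets an administrator
# list or purge one campaign's evidence in a single prefixed operation.
# The path layout is an assumption, not the platform's real scheme.

from pathlib import PurePosixPath

def media_path(campaign_id: str, attempt_id: str, filename: str) -> str:
    """Build a campaign-scoped storage key for one media artifact."""
    return str(PurePosixPath("media") / campaign_id / attempt_id / filename)

def campaign_prefix(campaign_id: str) -> str:
    """Prefix used to list or purge all media for one campaign."""
    return str(PurePosixPath("media") / campaign_id) + "/"

key = media_path("camp_01", "att_77", "camera_0001.webm")
print(key)                                          # media/camp_01/att_77/camera_0001.webm
print(key.startswith(campaign_prefix("camp_01")))   # True
```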

Customers are expected to communicate recording expectations clearly to candidates before test start. NativeScore supports this through instruction prompts, permission checks, and attempt gating controls, but institutional policy communication remains the responsibility of the deploying organization.

6. Data sharing, subprocessors, and third-party integrations

NativeScore does not sell candidate personal information. Data may be shared with authorized users of the same customer organization according to role and campaign access rules. Limited sharing may occur with infrastructure providers and technical subprocessors that support hosting, storage, delivery, analytics, monitoring, or speech processing capabilities, strictly to the extent necessary to deliver platform functions.

When customers configure third-party speech-to-text or language intelligence providers, relevant audio and metadata can be transmitted to those services according to integration settings chosen by the customer. Customers should review third-party terms and regional compliance requirements before enabling such integrations. NativeScore can provide integration controls, but responsibility for vendor suitability within a specific legal regime remains with the customer organization.

Government requests, legal obligations, or court orders may require disclosure in limited circumstances. Where permitted, we aim to narrow scope, verify legal basis, and preserve transparency with the affected customer organization.

7. Retention periods and deletion approach

Retention duration depends on data category, plan configuration, and contractual terms. Attempt records, score artifacts, and operational logs are retained as needed to run the service, support dispute resolution, and maintain accountable reporting. Certificate verification records may be retained for long-term trust continuity where lifetime verification is part of the service model.

Media evidence retention may be shorter or longer depending on customer policy and storage governance settings. Customers that require campaign-level media cleanup should coordinate operational retention controls with platform support to avoid accidental deletion of records that are still needed for compliance or review. Retention decisions should be documented by the customer in line with internal policy and applicable law.

When valid deletion requests are received and no overriding legal, contractual, or security obligation requires continued retention, data may be removed or anonymized using controlled procedures. Some backups may persist for a limited period as part of disaster recovery architecture before final expiration.
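One common pattern for "removed or anonymized" is to strip direct identifiers while keeping non-personal score artifacts for aggregate reporting, plus a salted one-way token for audit continuity. This is a minimal sketch under assumed field names, not NativeScore's actual deletion procedure.

```python
# Hypothetical anonymization sketch: direct identifiers are dropped,
# aggregate-safe fields (score, campaign) are kept, and a salted
# one-way hash token replaces the identity so the deletion can still
# be audited. Field names and salt handling are assumptions.

import hashlib

PERSONAL_FIELDS = {"name", "email", "phone"}

def anonymize(record: dict, salt: str) -> dict:
    """Strip direct identifiers; keep a salted one-way token so the
    anonymized row can be tied to the deletion request's audit trail."""
    token = hashlib.sha256((salt + record["email"]).encode()).hexdigest()[:16]
    cleaned = {k: v for k, v in record.items() if k not in PERSONAL_FIELDS}
    cleaned["subject_token"] = token
    return cleaned

row = {"name": "A. Candidate", "email": "a@example.com",
       "phone": "000", "score": 81, "campaign_id": "camp_01"}
print(anonymize(row, salt="per-deployment-secret"))
```

Because the token is salted and one-way, it cannot be reversed to the original email, but the same request can be matched consistently within one deployment.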

8. Access control, security safeguards, and incident response

NativeScore uses layered controls including authentication checks, role-scoped authorization, campaign-level visibility restrictions, transport security, and audit-friendly workflow design to reduce unauthorized access risk. Administrative operations are expected to follow least-privilege principles so users only see data required for their responsibilities.

No digital platform can guarantee absolute security, but we continuously prioritize practical protections that reduce the likelihood and impact of incidents. This includes monitoring for anomalous behavior, applying infrastructure hardening, and maintaining process controls around media handling and sensitive result workflows. Customers are also responsible for safeguarding credentials and enforcing organizational security hygiene on endpoint devices.

In the event of a confirmed incident affecting personal data, response actions may include containment, investigation, remediation, and legally required notifications. Timelines and notification pathways depend on jurisdiction and contractual terms.

9. User rights, requests, and grievance channels

Depending on applicable law, individuals may have rights to request access, correction, deletion, restriction, portability, or objection regarding personal data processed through NativeScore. Because many users are provisioned by institutions or employers, request handling may involve both NativeScore and the relevant customer organization that acts as data controller for campaign decisions.

To submit a privacy request, users can contact the support channel listed on this page and include enough detail to identify the account, campaign, and request type. We may require verification steps to protect account security and prevent unauthorized disclosure. Some requests may be limited where legal obligations, fraud prevention, contractual duties, or public-interest grounds apply.

If a user believes their privacy concern has not been handled appropriately, they may escalate through the documented grievance route or relevant data protection authority, depending on jurisdiction.

10. Cookies, analytics, and public pages

Public pages may use essential technical cookies or local storage mechanisms required for session continuity, security, and basic performance. Where analytics tools are used, information may be aggregated to improve website quality, understand traffic behavior, and prioritize support resources. Analytics usage is governed by minimization principles and should not expose restricted attempt data on public surfaces.

Users can manage cookie preferences through browser settings, but disabling essential mechanisms may impact site functionality. Assessment sessions in particular rely on stable browser behavior for secure flow and timing accuracy. Organizations should provide candidates with practical device guidance before high-stakes attempts.

Public verification routes are designed to provide targeted certificate checks and should not be interpreted as a broad personal data search interface.

11. International transfers and jurisdictional considerations

Where infrastructure or subprocessors operate across regions, personal information may be processed in jurisdictions different from the user location. In such cases we aim to apply suitable safeguards through contractual and technical measures appropriate to the service context. Customers with strict residency requirements should discuss deployment constraints before production rollout.

Cross-border obligations vary by law. Customer organizations remain responsible for evaluating whether their use of the platform meets local legal requirements, including employee notice duties, candidate consent obligations, and public-sector procurement rules where applicable.

NativeScore can support implementation guidance, but legal interpretation for a specific institution or employer should be obtained from qualified counsel.

12. Changes to this policy and contact details

This policy may be updated when product capabilities, legal requirements, or operational practices change. Material updates will be reflected on this page with revised wording so users can review current handling expectations. Continued use of the platform after updates means the revised policy applies to the extent permitted by law and contract.

For privacy questions, security concerns, or policy clarification, contact NativeScore at contact@nativescore.com or call 7827806009. Please include your organization name, campaign reference, and user role so the team can route your request accurately and respond faster.

Last reviewed: 15 March 2026.

Need clarification on campaign-specific privacy obligations?

Contact the NativeScore team with your campaign model, data residency needs, and retention policy requirements for implementation guidance.

contact@nativescore.com

FAQ

Frequently asked questions

What data does NativeScore collect during assessments?

Assessment data can include profile inputs, responses, scoring records, and optional proctoring artifacts such as screen, camera, and voice recordings.

How long are certificate details retained?

Certificate verification records are retained for long-term public validation, while download windows can vary by selected plan.

Who can access attempt intelligence?

Access is role-based and campaign-scoped for admins, supervisors, and HR users according to configured permissions.

Industry Knowledge Base

Intent-driven guides for HR teams, schools, colleges, and multilingual hiring programs

Online Assessment Trend

Why secure online language and typing assessments are now a core business requirement

Hiring and education teams are moving from offline screening to digital-first evaluation because recruitment speed, distance learning adoption, and distributed operations demand measurable online outcomes. NativeScore supports this transition by combining writing, listening, reading, speaking, and typing workflows in one environment that is easy to operate but detailed enough for serious decision making. This maps directly to high-intent search behavior, where users look for an online language assessment platform, secure remote test software, or a typing test with a certificate that supports real hiring readiness.

Organizations need more than question delivery alone. They need test integrity, candidate comparability, role-based review paths, and verifiable credentials that can be trusted by external stakeholders. NativeScore addresses these needs through campaign-level controls, progress tracking, and an auditable output structure, so assessments are not one-time forms but process-ready evaluation systems. This helps HR teams, colleges, and training institutes run scalable screening with better confidence and lower process ambiguity.

From a growth perspective, a platform that aligns to real intent queries and workflow outcomes can help institutions attract qualified candidates and partners faster. Clear digital assessment operations reduce failure caused by manual confusion, while structured reporting enables faster decisions in interview pipelines and placement programs.

Special Offer

INR 121 special pricing for schools, colleges, and students who need practical exam exposure

NativeScore includes a special INR 121 track so schools and colleges can give students meaningful hands-on practice on real online assessment behavior. This offer is designed for institutions that want affordable access without compromising the practical value of the test flow. Instead of only theory training, students can experience timed sections, answer discipline, typing pressure, and result-based performance checkpoints that better reflect modern hiring and training expectations.

For student communities, this special plan is useful because many companies run mandatory online pre-HR tests before interview rounds. Candidates who are academically strong often underperform when they face strict timers, typed response constraints, and language-based filters on unfamiliar digital interfaces. With repeated practice under structured conditions, they build confidence and improve first-attempt outcomes in actual recruitment assessments.

Institutions can also position INR 121 as an employability enablement initiative. Placement cells can conduct periodic readiness drives, identify weaknesses in comprehension and typing speed, and run focused improvement cycles backed by measurable score evidence. This makes student support more practical, data-aware, and outcome-focused.

HR Use Case

How HR teams evaluate Hindi, English, and regional language capability before final interviews

NativeScore helps HR and recruitment operations run consistent language screening before manager rounds. Instead of relying only on resume claims and quick telephonic checks, teams can evaluate measurable ability across comprehension, expression, listening interpretation, and typing consistency. This is especially valuable in support operations, sales communication, multilingual service desks, and call center processes where communication quality directly affects customer experience.

A structured pre-HR screen saves interview bandwidth by filtering unprepared candidates early and routing stronger profiles forward with evidence. Recruiters can compare candidates in the same scoring frame and share score snapshots with hiring managers in a clear, process-friendly format. This improves interview quality and reduces cost per qualified shortlist because candidate capability is validated before high-value interview time is consumed.

For organizations managing diverse regions, multilingual screening can be mapped campaign-wise, ensuring each role is evaluated in the required language context. NativeScore supports this operational model without forcing teams to maintain separate disconnected systems for each language path.

Government and Public-Service Style Hiring

Regional language readiness support for high-volume service recruitment contexts

Government-facing and public-service style recruitment workflows often require candidates with practical language control in regional contexts. NativeScore can support these requirements through configurable campaign setup and multilingual evaluation paths. Teams can run structured Hindi, English, and regional-language assessments with consistent scoring logic to improve selection quality in large service environments.

Third-party assessment support becomes important when hiring volume is high and fairness expectations are strict. NativeScore contributes by providing transparent process structure, campaign governance controls, and verifiable certificate outputs that can be validated independently. This combination helps build trust in screening quality while reducing dependence on manual evaluation variability.

Where call center and citizen-support processes need regional fluency, NativeScore can be used to benchmark candidate readiness before onboarding. This protects service quality and ensures language fit is validated in measurable terms rather than assumed from self-declaration.

Case Study

Campus fresher readiness case: preparing candidates for MNC typing tests and online pre-HR assessments

In a common placement scenario, students completed college with strong academic performance but limited familiarity with online screening tools used by large employers. Many faced typing tests and communication assessments before HR rounds and were eliminated due to interface pressure, timing stress, and low confidence rather than lack of potential. NativeScore was adopted as a practical readiness layer to bridge this gap.

The placement team ran repeat mock cycles with targeted feedback for typing speed, reading focus, listening retention, and response quality. Because each attempt produced structured score data, mentors could coach students with evidence instead of generic advice. Over multiple cycles, candidates reported stronger comfort with timed exam conditions and better execution in employer-facing online rounds.

This case pattern shows why practical digital exposure matters for freshers. NativeScore transforms readiness from one workshop to a measurable progression system, helping institutions reduce avoidable failure in pre-HR screening pipelines and improve candidate confidence during campus hiring seasons.

Distance Learning

How schools and colleges use online assessments to prepare students for digital-first careers

Distance learning and hybrid programs have made digital assessment behavior a foundational skill. Students now need to operate confidently in secure online environments, complete timed tasks, and submit high-quality responses without classroom dependency. NativeScore helps institutions train this behavior through realistic practice campaigns that mirror modern exam and hiring conditions.

Colleges can integrate assessments into placement readiness calendars, language labs, and employability modules so students gain repeated familiarity with real platform workflows. This includes typing discipline, comprehension under time constraints, and response quality tracking. The result is not only score improvement but also reduced anxiety when students face mandatory online tests from recruiters and corporate hiring partners.

For schools, early exposure to guided digital assessments creates long-term benefits in confidence, self-evaluation, and communication skill development. Institutions can use this model to support both academic progression and future workforce preparedness.

Typing Assessment Need

Typing tests as mandatory filters in BPO, support, and data-driven job roles

Many large employers include typing and online language screens before HR conversations, especially in roles that require real-time communication, ticket handling, and service documentation. NativeScore aligns with this hiring reality by offering practical typing and language capability checks that can be repeated for improvement and benchmarked for eligibility decisions.

Students and candidates often underestimate typing as a selection criterion until they encounter actual screening rounds. With NativeScore, institutions and training partners can build targeted preparation programs where participants practice speed, accuracy, and comprehension together, not in isolation. This integrated approach is more aligned with real recruitment tools than standalone typing games.

When typing outcomes are tied to broader language assessments, organizations get a stronger capability picture. Recruiters can identify candidates who not only type fast but also understand instructions, process input correctly, and respond in role-relevant language standards.

Supervisor and Admin Intelligence

Operational visibility for campaign owners, reviewers, and compliance teams

NativeScore gives operational stakeholders practical control through role-based dashboards. Admin users can configure campaigns, supervisors can review attempt intelligence, and HR teams can evaluate candidate outcomes based on authorized scope. This design supports accountability in environments where multiple teams participate in screening decisions.

Visibility is especially important when organizations run multilingual or high-volume assessments. Decision makers need a clear view of status, progress, and quality indicators without navigating disconnected systems. NativeScore centralizes this workflow so review efficiency improves while process integrity remains intact.

For institutions and enterprises, this governance model supports repeatable execution. Teams can standardize workflows, reduce operational confusion, and maintain quality as assessment scale increases across departments or campuses.

Certificate Trust

Public verification and long-term value of digital language credentials

Certification is valuable only when it can be trusted by third parties. NativeScore includes certificate generation with public verification so employers, institutions, and partners can validate credential authenticity using a unique ID. This supports credibility in hiring and learning ecosystems where proof of skill must be checked quickly and reliably.

Public verification also helps candidates because they can share credentials with confidence across opportunities without repeated manual explanations. Recruiters and reviewers can validate results independently and proceed with decision workflows faster. This reduces friction between candidate achievement and employer trust.

In long-term operations, certificate verification becomes a durable quality signal for the platform itself. It demonstrates process transparency and improves confidence in assessment outcomes beyond the moment of test completion.

Multilingual Capability

Hindi, English, and regional language evaluation under one controlled framework

Language readiness varies by role and geography. NativeScore supports multilingual configuration so campaigns can evaluate candidates in Hindi, English, and regional languages according to job context. This flexibility is critical for organizations hiring across markets where language fit directly influences service performance.

Regional language assessment is increasingly important in customer support and distributed service processes. NativeScore helps organizations validate communication quality before onboarding, reducing downstream service risk and training overhead. Institutions can also use this model to prepare students for region-specific opportunities with better confidence.

Running all language paths in one framework improves governance and reporting consistency. Teams avoid fragmented tools and maintain a standard process for scoring, review, and certificate validation across language tracks.

Implementation Roadmap

From pilot to scale: practical rollout steps for institutions and employers

Most successful deployments follow a phased model: start with one campaign, validate score reliability, train operational users, and scale based on outcome confidence. NativeScore supports this progression through campaign-level controls that can be expanded without redesigning the entire workflow. This lowers implementation risk and accelerates adoption.

During the pilot phase, teams can focus on one high-impact use case such as pre-HR language screening or placement typing readiness. Once baseline effectiveness is proven, additional role tracks, language paths, and review policies can be introduced in a controlled manner. This keeps expansion predictable and measurable.

A phased roadmap is also budget-friendly because organizations can align license growth with demonstrated value. NativeScore can therefore support both early-stage adoption and enterprise-scale expansion while maintaining process continuity.

Custom Tooling by License Size

Requirement-based customization for high-volume assessment programs

When candidate volume grows, organizations often need custom workflow behavior: role-specific campaign paths, reporting depth changes, governance rules, and specialized process alignment. NativeScore can support requirement-based configuration linked to license scale so customers can move beyond generic setups and operate with better strategic fit.

This model is useful for large campus networks, enterprise hiring programs, and multilingual service operations where one-template logic is not enough. By mapping configuration to license and process size, NativeScore enables practical customization without losing platform consistency.

Custom tooling can include content strategy alignment, review queues, certificate governance, and language-specific deployment controls. The result is a scalable assessment system designed around real operational outcomes rather than isolated test execution.