
Dialogue List Scope Of Work

User Guide here

How-To Videos here

Scope of Work 

  • To fulfill a dialogue list request, partners are required to use Netflix’s proprietary script authoring tool to create frame-accurate timing and an accurate transcription of dialogue against the video provided. 
  • Script partners will be allocated work via Grand Bazaar and Source Management, then asked to assign their resources in Netflix’s dubbing portal, the central location to manage the authoring process.
  • Fulfillment will include one delivery against a preliminary (or locked) cut of picture and one update to the final picture. Any additional versioning is subject to the conform rate and will be at the Netflix representative’s discretion. 
  • In the script authoring tool, partners will work against the most recent Production Locked Proxy or Final Proxy. Additional materials such as Shooting Scripts may be available but are not required for fulfillment.
  • Automatic Speech Recognition technology is used to generate a base transcription of the English dialogue into script events, which include accurate in and out timecodes along with a generic speaker identity (diarisation).
  • The script event boxes contain timecode, character, event number, transcription, and the ability to tag annotations. From these auto-transcribed events, the script author is free to use the auto-transcribed dialogue or clear the text and originate from scratch. 
  • The transcriber will generally find the auto-generated timecodes sufficiently accurate, though they may require small tweaks. We ask that in and out timecodes fall within a 3-frame tolerance of the start and end of the lip flap or spoken word (see the sketch after this list).
  • Once authoring is complete, the author will submit within the tool, which will auto-update the Source Management request (Final Script).
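
The sketch below illustrates the 3-frame timecode tolerance described above; it is not part of the authoring tool. The HH:MM:SS:FF timecode format, the 24 fps frame rate, and the function names are assumptions made only for this example.

    # Illustrative sketch only: check whether an event boundary sits within the
    # 3-frame tolerance of the observed lip flap / spoken-word boundary.
    FPS = 24  # assumed frame rate of the proxy

    def tc_to_frames(tc: str) -> int:
        """Convert an HH:MM:SS:FF timecode string to a total frame count."""
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * FPS + ff

    def within_tolerance(event_tc: str, audio_tc: str, tolerance_frames: int = 3) -> bool:
        """Return True if the event timecode is within the allowed frame tolerance."""
        return abs(tc_to_frames(event_tc) - tc_to_frames(audio_tc)) <= tolerance_frames

    # Example: an in-time 2 frames ahead of the spoken word passes; 6 frames does not.
    print(within_tolerance("01:02:10:05", "01:02:10:07"))  # True
    print(within_tolerance("01:02:10:01", "01:02:10:07"))  # False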

Service Level Agreement (SLA)

  • To be considered on time, the partner must deliver the initial transcription within the agreed SLA from receipt of video. Requests for delivery faster than the runtime-based SLAs below will be subject to a rush fee.
  • Specific SLAs are as follows, with runtimes rounded up to the next 30-minute interval (see the sketch after this list):
    • 30 min TRT - 3-4 business days
    • 60 min TRT - 4-5 business days
    • 90 min TRT - 6-7 business days
    • 120 min TRT - 7-8 business days
  • Versioning and conforms should be delivered within 48 hours.
  • Failure to meet the requirements and style guide specifications will result in redelivery requests at the partner’s cost, which will impact partner metrics.
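
As an illustration of the rounding rule above, the sketch below maps a runtime to the published SLA tiers. It assumes runtimes over 120 minutes fall outside the published tiers, which this article does not state.

    # Illustrative sketch only: round the runtime up to the next 30-minute interval,
    # then look up the published SLA tier.
    import math

    SLA_BUSINESS_DAYS = {30: "3-4", 60: "4-5", 90: "6-7", 120: "7-8"}  # TRT minutes -> business days

    def sla_for_runtime(runtime_minutes: float) -> str:
        """Return the delivery SLA (in business days) for a given total runtime."""
        tier = max(30, math.ceil(runtime_minutes / 30) * 30)
        if tier not in SLA_BUSINESS_DAYS:
            raise ValueError("Runtimes above 120 minutes are not covered by the published tiers.")
        return SLA_BUSINESS_DAYS[tier]

    # Example: a 47-minute episode rounds up to the 60-minute tier.
    print(sla_for_runtime(47))  # 4-5 business days
    print(sla_for_runtime(95))  # 7-8 business days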

Dialogue List Style Guide

  • Authoring in the tool consists of transcription editorial and notes tagging. Each dialogue event is a self-contained unit, organized linearly in the dialogue editor section. Each event will require some level of editorial: in-timecode, out-timecode, source, and dialogue transcription. Notes are annotations used to capture important details that guide the dubbing/subtitling process. When creating a note, the user is prompted to select the note type and then add any additional text. Notes have parent and child attributes as well as free-form text. 
  • Types of Dialogue Editorial
    • Timecode should be frame-accurate to Netflix burn-in timecode. 
    • Source: Speaker names should be in all caps and use consistent, correct spellings (according to source material such as the shooting script, credits, or on-screen text). Unless multiple characters are saying the same dialogue at the same time, only include one speaker/source per event (exception: if dual speakers are included in the burned-in forced subtitle). For multiple people speaking in unison, include the names separated by a forward slash in the source event. Examples: MARIA for a single speaker; MARIA/JOHN for two characters speaking in unison.
    • Dialogue should contain a complete, accurate transcription of all dialogue and stutters/syllables, as well as descriptions of mouth sounds in brackets (e.g. [sighs], [laughs], [screaming]) placed in the dialogue event before the dialogue text begins. Include any descriptors that do not apply to specific dialogue or do not describe mouth sounds in parentheses (e.g. (long pause), (fades out to shot change)). Walla should also be indicated wherever prominent.
    • Best practice: Utilize final dialogue stems to transcribe difficult-to-hear dialogue and individual lines within larger walla beds. Reach out to Netflix to see if they are available.
  • Types of Notes and Tags 
    • Context annotations: please include annotations for words/phrases in the annotation notes event box
    • Character 
    • On-Screen Text - Burned-In Subs: these events should contain transcriptions of production-approved burned-in subtitles (for foreign dialogue, hard-to-hear dialogue, or sign language) as they appear in that cut of picture. Please include the character name in the source box in the authoring tool, and include the transcription of the on-screen text in the main dialogue event box.
      • Content/timing of subtitles should match most recent cut and/or document provided by production. Line and subtitle breaks should be honored as they appear in documents.
      • Please split continuous dialogue into rows according to the corresponding subtitles, if applicable. If the timing of the dialogue and subtitles is not the same, time according to the dialogue. 
      • If burned-in subtitles contain content translating foreign dialogue, the dialogue event must contain a description (e.g. [speaking Spanish]) or a transcription of the foreign dialogue in the same row. 
      • For hard-to-hear dialogue with burned-in subtitles, the dialogue event should still contain a transcription that will often match the content in the burned-in subtitle column.
      • For sign language with burned-in subtitles, the dialogue event should be left blank as no dialogue is being spoken.
  • Note: Final picture will often be clean of subtitles, but this content should stay in the final Dialogue List and be timed to match the final version of picture as it would occur had it been left in.
  • On-Screen Text - Other Types: include all prominent, pertinent on-screen text in the annotations editor, with the transcription of the on-screen text in the main dialogue event box. Please describe the event accurately via the dropdown options and use the annotation notes section for further context (if needed). Please do not include tagging for production notes such as VFX, ADR, or reshoot notes, as these will not be treated for localization. Please use the verbiage NONE in the source box in the authoring tool; this allows the tag type to direct the text to the proper column in the XLS export. 
      • On-Screen Text - Graphics & Inserts
      • On-Screen Text - Principal Photography
      • On-Screen Text - Episodic Title
      • On-Screen Text - Netflix Credits
      • On-Screen Text - Main Title 
      • Place the transcribed & translated text in the main dialogue box
      • Leave the annotation notes section for annotations
  • Foreign dialogue treatment:
    • Please transcribe and italicize all words and short phrases foreign to the original version language in the Dialogue column
    • For longer passages that cannot be transcribed without the assistance of a native speaker, indicate the language being spoken in brackets (e.g.: [speaks German]).
    • Please also indicate foreign dialogue using the annotation tags and the dropdown option "Foreign Dialog". We ask that both the brackets and these tags be applied. 
    • [Screenshot: Foreign_Dialog.png]
    • If dialogue is intended to be understood or subtitled, translations will be provided either as burned-in subs in a locked cut or in a separate document from Netflix/production. Indicate the language of speech in the dialogue events (e.g. [speaks German]) and transcribe the accompanying burned-in subtitles. 
    • If any mistranslations are noted in burned-in subtitles, please flag to Netflix before updating in the dialogue list.
    • For other general text style requirements not specified above, please refer to the English SDH Style Guide and apply the treatment to the Dialogue list. 
  • Songs & Ditties should contain transcriptions of vocals, whether licensed, unlicensed, original IP, or ditties. 
  • Archival Footage - TV news footage, court footage, previously released movie/TV content, etc.; i.e., anything filmed outside of the film/series itself should be transcribed and tagged as archival footage.
  • Export
    • The tool is set up to output a format that best suits our partners and internal stakeholders. This format resembles the Netflix Excel template used before the tool was built. Therefore, the script partner does not have to worry about change logs, project info, or formatting specific to previous expectations. 
  • Off-screen dialogue treatment:
    • When a character speaks off-screen (still in-scene but not on-camera), please indicate by selecting (OFF) in the dialogue edit box.
    • If the character’s dialogue alternates between on- and off-screen, please indicate the order in which it occurs, e.g. (OFF/ON) for dialogue that starts off-screen and continues on-screen, and (ON/OFF) for dialogue that starts on-screen and goes off-screen. There is no need to include more than two cues. The dialogue event box has a drop-down option for both.
    • If a character is providing voice-over (not in-scene at all), please start a new time-coded dialogue entry for when voice-over begins and include (VO) at the end of the character’s name in the event.

Change List

  • This list will include changes/updates from the previous delivery of the Dialogue List.
  • Granular details about updated dialogue/on-screen text and summaries of global or broad changes (e.g. an eight-second offset at the beginning of the program, a scene added at 01:22:30, etc.) should be noted in the Annotations column (see the sketch after this list).
  • Please refrain from using formatting (bold, underline or strikethrough) to indicate changes, as these might be lost when importing documents to different tools.
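
As an illustration of the kind of global change a change list might summarize, the sketch below applies a fixed offset to one event’s timecodes. The HH:MM:SS:FF format, the 24 fps frame rate, and the function names are assumptions made only for this example; it is not part of the authoring tool.

    # Illustrative sketch only: shift an event's in/out timecodes by a global offset,
    # e.g. an eight-second offset at the beginning of the program.
    FPS = 24  # assumed frame rate

    def tc_to_frames(tc: str) -> int:
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * FPS + ff

    def frames_to_tc(frames: int) -> str:
        ss, ff = divmod(frames, FPS)
        mm, ss = divmod(ss, 60)
        hh, mm = divmod(mm, 60)
        return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

    def offset_event(tc_in: str, tc_out: str, offset_seconds: int) -> tuple:
        """Shift one event's in/out timecodes later by a whole-second offset."""
        shift = offset_seconds * FPS
        return frames_to_tc(tc_to_frames(tc_in) + shift), frames_to_tc(tc_to_frames(tc_out) + shift)

    # Example: an eight-second offset pushes this event from 01:00:10:00 to 01:00:18:00.
    print(offset_event("01:00:10:00", "01:00:12:12", 8))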



Operations Run Book

High Level Overview

The Script Authoring tool is a Netflix application used to transcribe, translate, and tag timecode-accurate dialogue events, leveraging speech-to-text technology and outputting a robust dialogue list for content localization (dubs & subs), supplementals, and marketing needs. Automatic speech recognition (ASR) pre-lays the transcription, then pulls metadata to auto-populate events, providing rich events filled with character, timecode, annotations, and speech detectors. Netflix partner project managers are asked to assign resources, manage communication with Netflix representatives, and meet our SLAs. Netflix partners and their resources are asked to edit, populate, and proof transcription, translation, and tagging through a full watchdown. 

Dialogue List workflow [High Level]

  1. Netflix to assign title via Grand Bazaar
  2. Netflix to assign Final Script source requests
  3. NP3 enters Dub Portal for title management and resource assignment
  4. NP3 Transcription resource to enter Task Dashboard, complete task
  5. NP3 Proofer (QC) resource to enter Task Dashboard, complete task
  6. NP3 project manager to oversee completion and communication with Netflix
  7. Conformance: Automatic conform request when Final Proxy arrives in source management.
    1. Dubbing Operations will manually request conformance for any other locked cut or final cut (e.g., Locked Cut version 5, Final Proxy version 2).

DUB PORTAL WORKFLOW

Overview

The Dub Portal tool is a dynamic dashboard providing control and operational visibility over the end-to-end dubbing process. Our vision is to provide accurate information, which is organized in a digestible manner, generating actionable tasks for all users. This is the project management application for both Netflix and our trusted partners for communication, title management, and linkage with all other tools in our ecosystem.

Partner Workflow (video here)

  • Dubbing or audio description partners will be assigned work via Grand Bazaar and Source Management, which will surface the assigned title on the Dub Portal dashboard. 
  • From the Dub Portal dashboard, all assigned projects will be displayed with statuses and visibility into dates and source assets, such as locked cuts and source language.
  • For script authoring tasks, the NP3 dub project manager will assign work via the Dub Portal. 

Starship Onboarding

Dub Portal access is granted through Starship to NP3 project managers. Through Starship, NP3 project managers will be able to onboard transcribers, proofers, and other project managers. 

  • Dub Portal starship role
    • Application: Dub Portal
    • Role: Dub Portal User
      • Allows the partner to view statuses across dubbing projects
      • Allows you to onboard other project managers to this role
  • Script Authoring tool starship role
    • Project Manager starship role
      • Application: Dub Script
      • Role: Vendor
        • Allows full admin rights over titles assigned to your studio 
    • Transcriber & Proofer starship roles
      • Application: Dub Script
      • Role: Dialog List Author
      • Role: Dialog List Proofreader

[Screenshot: Scripting_Tool_screenshot_1.png]

 

Dub Portal Dashboard

  • In the dashboard view, you will see your assigned titles and four columns with statuses. 

[Screenshot: Scripting_Tool_Screenshot_2.png]

  1. Title, Movie ID, Format, Episode count, Thumbnail
  2. Dropdown to display Active, All, and Archived projects
  3. ORIG LANG - source language of title, identified by language code
  4. Launch date
  5. Picture Cuts availability
  6. Dialogue List progress - defined by percentage complete and number of lists needed
  7. Settings gear icon - email digest options, UX themes (light, dark)
  8. Project settings - ability to archive, save to ‘my projects’, and open in source management

Project Level

  • Once you click on the thumbnail or title name, you will enter the project. [Screenshot: Scripting_Tool_screenshot_3.png]
    1. Status of dialogue list assignment: Assigned, Partially Assigned, Unassigned
    2. Locked or Final Cut status: the top cut displays the new cut, the bottom cut is the current/existing dialogue list
    3. Overall status of the episode/standalone: Ready, In Progress, Completed
    4. View details of resources and project, plus ability to assign transcriber and proofer

Assignment

  • NP3 project managers will assign their personnel, whether in-house or freelance, for transcription and/or proofing tasks. 

[Screenshot: Scripting_Tool_screenshot_4.png]

  • In this window, you can start typing your resource’s email address and it will auto-populate for reference

[Screenshot: ScriptingTool_screenshot_5.png]

  • Once internal personnel are assigned to a project, they will receive an email notification with a link to take them to their project dashboard.

 

 
