The document discusses using executable specifications to capture expert knowledge on the NRK media player project. Specifications were written in Gherkin using the SpecFlow framework to describe requirements, which allowed developers to work closely with domain experts and validate requirements through automated tests. Lessons learned include starting from acceptance criteria rather than end-to-end testing, and using specifications as a communication tool between technical teams.
Staying close to experts with executable specifications and SpecFlow
1. Staying close to experts
with executable specifications
Vagif Abilov, Peder Søholt
3. Case study: NRK player
Norwegian Broadcasting Corporation (NRK)
• On-demand TV/Radio media player
• Access to the whole program archive
• Copyright restrictions
• Rich metadata information
6. Distributed domain knowledge
• End-user experience
• Program player
• Metadata database
• Media file distribution
• TV and radio programs are handled by different
subsystems
• Program index management
• Subtitle management
8. Knowledge gathering challenges
• Experts are engaged in multiple projects
• Developers are divided into multiple groups and placed
in different locations
• Unified terminology is defined but not fully adopted
• External systems bring their own terms and standards
• Specifications are frequently revised
• Interactive sessions with experts may reveal additional
design constraints
9. Program metadata
• Series -> Seasons -> Programs -> Indexes
• Metadata aggregate may contain
– Titles
– Descriptions
– Subjects
– Locations
– Contributors
• Original program metadata are usually not sufficient for
on-demand access, so extending the metadata is part of
the project scope
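The hierarchy and aggregate above can be sketched as a simple object model. This is a hedged illustration only; the type and property names are ours, not NRK's actual schema:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of the Series -> Seasons -> Programs -> Indexes
// hierarchy and the metadata aggregate described on this slide.
public record Contributor(string Name, string Role);

public record ProgramMetadata(
    string Title,
    string Description,
    IReadOnlyList<string> Subjects,
    IReadOnlyList<string> Locations,
    IReadOnlyList<Contributor> Contributors);

public record Index(string Id, TimeSpan Offset);   // an index points into a program
public record Program(string Id, ProgramMetadata Metadata, IReadOnlyList<Index> Indexes);
public record Season(string Id, IReadOnlyList<Program> Programs);
public record Series(string Id, IReadOnlyList<Season> Seasons);
```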
10. Obtaining metadata
• Metadata come from 4 different data sources
(programs/transmissions, indexes, subtitles, radio)
• Metadata sources belong to different subsystems
managed by different experts
• Protocols include SOAP and REST, both push and pull
methods
• Data transformation rules are established through a
series of interviews with experts and are revised with
each iteration
11. Metadata validation workflow
[Diagram: metadata from program transmissions, TV program metadata and indexes, radio program metadata and indexes, and subtitles flows into the metadata bank; the validation steps are phrased as Given/When/Then.]
12. BDD framework selection
• Main development is on .NET platform
• Early consideration: from FitNesse to StoryTeller
• Gherkin appeared to be a good communication
language choice
• Collected scenarios became a foundation for
executable specifications and acceptance tests
• Considered: Cuke4Nuke
• Selected: SpecFlow
13. Example of the specification
Scenario: Update rights for a TV program
Given ODA database contains raw rights for program "NNFA20400099"
| VodPatternId | PublishStart | PublishEnd |
| 83223739100 | 2012-11-01 | 2012-12-01 |
And ODA database does not contain raw rights with pattern "832237391"
When MgxGranitt adapter service receives pattern with data
| ChangeType | AlternateId | Id | StartDateTime | EndDateTime |
| NewOrUpdate | NNFA20400099 | 832237391 | 2012-11-11 | 2012-12-11 |
Then MgxGranitt adapter service should return OK
And ODA database should contain rights for program "NNFA20400099"
| PublishStart | PublishEnd |
| 2012-11-01 | 2012-12-11 |
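Behind such a scenario, SpecFlow binds each step to a method via attributes. A minimal, hypothetical binding for the final Then step above might look like this (the `OdaDatabase` helper and `RightsRow` type are ours, purely for illustration; `Table.CompareToSet` is SpecFlow's table-comparison helper from SpecFlow.Assist):

```csharp
using TechTalk.SpecFlow;
using TechTalk.SpecFlow.Assist;

[Binding]
public class RightsSteps
{
    // Hypothetical row type matching the table columns in the scenario.
    public class RightsRow
    {
        public string PublishStart { get; set; }
        public string PublishEnd { get; set; }
    }

    [Then(@"ODA database should contain rights for program ""(.*)""")]
    public void ThenDatabaseShouldContainRights(string programId, Table table)
    {
        // OdaDatabase is an assumed test helper wrapping the real database.
        var actual = OdaDatabase.GetRightsFor(programId);

        // SpecFlow.Assist compares the Gherkin table against the actual rows
        // and fails the step with a readable diff on mismatch.
        table.CompareToSet(actual);
    }
}
```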
18. Support for LINQ operations
• We may need to specify match type:
• Equivalence (same items in both collections, order is
insignificant);
• Equality (same items in the same order);
• Subset (one collection is a subset of another collection);
• Intersection (collections have common items).
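The four match types can be expressed directly with LINQ sequence and set operators; a minimal sketch (note that `Except`/`Intersect` are set-based, so duplicate counts are ignored):

```csharp
using System.Linq;

var expected = new[] { 1, 2, 3 };
var actual   = new[] { 3, 2, 1 };

// Equivalence: same items in both collections, order insignificant.
bool equivalent = !expected.Except(actual).Any()
               && !actual.Except(expected).Any();

// Equality: same items in the same order.
bool equal = expected.SequenceEqual(actual);

// Subset: every expected item occurs in the actual collection.
bool subset = !expected.Except(actual).Any();

// Intersection: the collections have at least one common item.
bool intersects = expected.Intersect(actual).Any();
```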
19. Lessons and discoveries
• Initial temptation to focus on end-to-end testing
• “Don’t use executable specifications as end-to-end
validations” (Gojko Adzic)
• “Acceptance tests written for pure testing sake are
normally a sign that the code is badly designed” (Matt
Wynne)
• Even though Gherkin adds extra
development/maintenance cost, some developers
found it convenient as a specification/documentation
tool among technical people
21. User stories and acceptance criteria
• Product owner works together with the interaction
designer and us developers to write user stories with
acceptance criteria
– Team work during the sprint
– Criteria in Given, When, Then format
• People with strong opinions
• Report showing all specifications.
• Report showing current sprint specifications
22. Gherkin and SpecFlow
• Write tests using SpecFlow
• Two kinds of specs:
• Acceptance criteria tests from the customer
• Specifications of how and where we have satisfied the criteria;
these may be technical specs
23. SpecFlow and SpecRun
• Running tests using SpecRun
• Can run parallel tests
• Creating reports using SpecRun
• Unfortunately no SpecLog usage yet
30. Web testing - RazorGenerator
• Precompiles the MVC Razor views into classes
• Tests the actual HTML output
• Fast
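With RazorGenerator the Razor views are compiled into classes, so a unit test can render one directly and assert on the markup without a browser. A hedged sketch (the view class and model are invented; `RenderAsHtml` is RazorGenerator.Testing's rendering helper as we recall it, returning an HtmlAgilityPack document):

```csharp
using HtmlAgilityPack;
using RazorGenerator.Testing;

// Hypothetical test: the view class and model type are illustrative.
var view = new MyApp.Views.Program.Details();      // precompiled view class
HtmlDocument html = view.RenderAsHtml(new ProgramModel { Title = "Nytt på nytt" });

// Assert directly on the generated markup; no web server or browser involved.
var heading = html.DocumentNode.SelectSingleNode("//h1");
```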
31. Web testing - Selenium
• Web testing with Selenium/WebDriver
• Use if you need to test specific behaviour on the web
• Tests JavaScript/AJAX functionality in the browser
• Slow, but tests actual functionality in the web browser
32. Web testing - Selenium
• Use SauceLabs
• Selenium/Webdriver testing in the cloud
• Can run parallel tests
• Lots of browsers and OS to choose between
• Can test closed in-house servers using SSH tunneling
33. Web testing tips
• Use an abstraction layer for easier maintenance
• Ex: page objects
• Do not target a specific tag with a hard-coded XPath or
CSS selector
• Ex: test that the HTML body contains the text you are looking for
• Do not use, for example, this XPath:
• /html/body/div/div[2]/div/div/section/article/hgroup/h1
• Do not use, for example, this CSS selector:
• html.gecko body#program.a-showprogramsync div#sfWrap
div#sfMain div#main div.box section#programMetaData.container
article#episode.span-10 hgroup h1
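The page-object advice above can be sketched with Selenium's C# bindings. The page class and selectors here are invented for illustration; only the `#programMetaData` id comes from the slide:

```csharp
using OpenQA.Selenium;

// Hypothetical page object wrapping a program page. Tests talk to this
// class instead of scattering hard-coded XPath/CSS selectors everywhere.
public class ProgramPage
{
    private readonly IWebDriver _driver;

    public ProgramPage(IWebDriver driver) => _driver = driver;

    // One stable, intention-revealing selector lives here; if the markup
    // changes, only this class needs updating, not every test.
    public string Title =>
        _driver.FindElement(By.CssSelector("#programMetaData h1")).Text;

    // Coarse-grained check in the spirit of "test that the HTML body
    // contains the text you are looking for".
    public bool BodyContains(string text) =>
        _driver.FindElement(By.TagName("body")).Text.Contains(text);
}
```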
34. Deleporter
• Cross-Process Code Injection for ASP.NET
• For mocking data, when running webtests on a different
machine than where you run your tests
• Developed by Steven Sanderson
39. Deleporter
• SECURITY RISK - DO NOT DEPLOY TO
PRODUCTION!
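From our recollection of Sanderson's write-up, Deleporter serializes a delegate and executes it inside the remote ASP.NET application's process. A rough sketch of the usage; the exact shape of the `Deleporter.Run` call and the repository swap are assumptions, so check the project's own documentation:

```csharp
// Rough sketch; API shape and the swapped repository are assumptions.
Deleporter.Run(() =>
{
    // This delegate runs inside the remote web application's process,
    // so it can replace real dependencies with fakes before the web
    // test drives the UI.
    ProgramRepository.Current = new FakeProgramRepository();
});
```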
42. Conclusions
• Work together with your customer to create good specs
• Please include an interaction designer and a developer
• Use tags in SpecFlow
• Generate HTML from views instead of running web tests:
much faster
• When using web tests, use Deleporter to mock the
server
43. Conclusions
• Executable specifications have proven effective for
gathering and maintaining expert knowledge
• Wide adoption requires enthusiasts and patience
• Using open source software makes it possible to adjust
it to the team's specific needs
• Even within the same organization, the practice and
interpretation of terms may differ significantly