Senior Data Manager: Leeds

  • Location:

    Leeds, West Yorkshire

  • Sector:

    Rail & Civil Infrastructure

  • Job type:

    Temporary

  • Salary:

    £500 - £550 per day + Inside IR35

  • Contact:

    Daniel Brokenshire

  • Contact email:

    Daniel.Brokenshire@talascendint.com

  • Job ref:

    BBBH92921_1627996228

  • Published:

    about 2 months ago

  • Duration:

    6 months

  • Expiry date:

    2021-08-24

  • Start date:

    ASAP


Senior Data Manager required in Leeds on a contract basis, with specialist knowledge and development expertise in data management tools such as Databricks, SSIS, R, Python, SAS DI Studio, or other advanced industry-standard tools.
Experience of using Databricks on AWS would also be desirable.

Work outputs

  • Deliver data management processes to provide data for customers of the DigiTrials service, providing dedicated resource in support of this priority service.
  • Build data pipelines that run according to automated schedules and meet the requirements of the DARS agreements.
  • Work with customers to define the specific outputs from the DARS catalogue of products, ensuring that outputs reflect the legal constraints of the agreement in terms of data items, frequency, start/end of agreement, and pseudonymisation.
  • Ensure that the processes constructed are aligned to existing internal standards.
  • Within the delivery squad, work with the Product Owner and DigiTrials service to prioritise the delivery of processes for DigiTrials customers.
  • Ensure that processes are constructed using a consistent approach and can be adapted, amended or fixed by other squad members in line with Agile working principles.



Knowledge
Essential - Generic

  • Advanced specialist knowledge and development expertise in using data management tools such as Databricks, SSIS, R, Python, SAS DI Studio, or other advanced industry-standard tools.


Skills and Experience
Essential - Generic

  • Proficient in identifying and managing demand in areas where there are shared and conflicting agendas.
  • Proficient in identifying and defining the resources needed to deliver to a specification.
  • Experience of working in and leading a multidisciplinary team using agile methodologies.
  • Experience developing and maintaining code, using languages such as R, Python, Spark SQL, SAS, T-SQL, PL/SQL and Scala.
  • Significant experience managing and automating data, using latest technologies.
  • Experience of analysing raw data and assessing/reporting on data quality, both to data suppliers and to senior management.


Data Management/Optimisation

  • Significant data management and optimisation expertise, covering process design; code development (using languages such as R, Python, SQL or SAS); data architecture; testing and assurance; and reference/master data management.
  • Experience of successfully delivering data management services in large and complex organisations.
  • Proven experience gathering, analysing, interpreting and prioritising customer requirements/specifications for complex and technical processes.
  • Experience developing clear process documentation, based on customer requirements, against which a developer could build a technical process.
  • Experience testing processes and/or software, and the development of test scripts.
  • Expert in building the most appropriate database and process design, developing code using industry standard data management tools, in this case Databricks (on AWS).
  • Expertise in developing code using established coding languages, specifically Python and ANSI SQL. The ability to understand MS SQL and/or SAS DI Studio is a bonus as legacy products are written in these languages.
  • Focus on user needs and developing solutions in close partnership with customers, using agile approach.
  • Build data products that are fully automated and can be re-used, extended, and deployed as "production quality" services.
  • Ability to collaborate with others to build the whole data processing pipeline, from data extract through to finished data products (which could include final extracts, indicators, and publication content).
  • Ability to work against a specification to develop automated and repeatable processes, but equally comfortable questioning the specification where it is ambiguous.
  • Suitable experience in the use of the following toolsets, which are key to the successful management of the Buyer's outcomes: Jira and Confluence.