
March 19, 2018

  1. Review draft agenda for May F2F
  2. Decide additional attendees
  3. (Fire)Cloud costs for CPTAC-wide usage: $25K seems to have effectively been reduced
  4. Batches 1 and 2 of genomic data are located at /xchip/gdac_data/cptac3/genomic_data_mirror

    1. WashU has apparently re-submitted the batch1 RNASeq data, using MapSplice to map transcripts to the gene level

    2. So we should be able, in principle, to proceed with our mRNA pipelines

  5. Mike has a short, unifying wrapper for all 3 of the DCC upload/download utilities, and can install it on the Unix server upon request
  6. More?

March 5, 2018

  1. Summary of the HG19 WashU/GDC workaround & potential recommendations to CPTAC leadership & collaborators.

    1. Consider: combing the GDC website to see if Docker images are available for their pipelines, and whether these could be instantiated in FC
    2. Consider: running a local MOAP-style pipeline on WXS data, to generate CN, mutation, and RNASeq results
      1. Decision:  wait for now, it's not fully baked yet
  2. TODO:  
    1. open edit permissions (on this page) to all viewers
    2. Identify 2-3 integrative proteogenomic analyses (must be on CPTAC3 data)
    3. Combine into iCoMut output
  3. Planning F2F on May 1-3:
    1. Quilts in FC?
    2. Although CustomProDB (from Baylor/Bing Zhang's group) does something similar to Quilts and is already in FC (from Karsten)
    3. FireCloud workshop
    4. Karsten & Mani: current instantiation of proteomic pipeline 
      1. On prospective BRCA data

Update to Jan 22, 2018 entry:

  1. Genomic data for CPTAC3 downloaded to: /xchip/gdac_data/cptac3/2018_02_02_genomic_data

  2. Only 2 cohorts (CCRCC and UCEC, i.e. kidney and endometrial) have genomic data available so far
  3. For the 3rd cohort (LUAD, lung), proteomic data has not been submitted by Broad yet, so WashU has apparently not processed the genomic data either

Jan 22, 2018

  1. Brief review of items missed from last meeting
  2. Proteogenomic Data Commons Steering Committee: 
    1. Held the 2nd advisory call last Wed
  3. WashU/GDC workaround: summary of discussion & decisions from 1/19 call

  4. New science: degradome?

Jan 8, 2018

  1. Welcome Yifat Geffen, newest member of CGA
  2. Brief review of latest suite of genomic run reports (total of 830)
  3. Whither a pathology image browser in CPTAC?  The GTEx pathology browser was authored here (and we have strong knowledge of the cancer pathology viewer), so we have a good deal of expertise & code that could in principle be leveraged.  I've drafted a suggestion for the PAAD DWG here.
  4. NMF clustering module question (auto-selection of K) from Mani?
  5. FireCloud hosting of CPTAC data (as partial workaround to lack of CPTAC genomic data at GDC)
  6. Medblast paper

Dec 11, 2017

  1. Items from the 11/27 meeting, which was cancelled
  2. GDC and CPTAC: summary notes from the week of 2017_12_06
    1. Original plan (and data products) given here
    2. Impact on CGDAC (the CGA part of the proteomics GDAC) sketched below
    3. Initial data generation will be shifting from GDC to WashU:
      • Mutation calls (both WES and WGS)
      • CN generation
      • RNAseq calls
      WashU products will be deposited to the Georgetown DCC
      Broad will download & remap names as needed/appropriate

    4. What's next?
      FireCloud (as a trusted partner) is now being considered as a distribution point
      Per a feeler conversation with Chris Kinsinger on 2017_12_01

    5. So, because these data will be HG19, our CGA/GDAC role in CPTAC may be better utilized by shifting gears: from running the existing FireCloud HG38 genomic pipelines on HG19 data (which leads to broken results) to loading these HG19 data products from WashU into FireCloud, so that it can serve as a distribution point

    6. Side question: why was the Georgetown DCC not considered for this? Scale? Absence of trusted-partner status?

  3. Status of the $25K to fund use of FireCloud across the entire CPTAC?
    1. Any progress? No; there was an attempt to issue the funds as AWS credits, but things are currently stuck w/r/t Google credits ... stay tuned
    2. billing project?

Nov 27, 2017

  1. Timeline for LUAD, UCEC and KIRC projects: given here

Oct 30, 2017: tentative agenda

  1. Discuss CPTAC-wide use of FireCloud: how to allot funding, create billing projects, add users, etc.
  2. Recall supervisor mode in FISSFC:
  3. Update on DSDE collaboration:
    1. Show recent CGA/DSDE collaboration proposal
    2. FISS backbone of Jupyter notebooks in FireCloud
    3. Code generator progress:
      1. standalone tool
      2. works on GTEx
      3. Swagger2 / FireCloud proof of concept has been done
      4. Full Swagger2 support is next
  4. Discussion for Wed 11/1 AWG telecon:
    1. Thoughts omitted from F2F talk, for time constraint: slides 21-39
  5. Chet: recent CPTAC2 workspaces ... where to go next?
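
The "code generator" item above refers to producing client stubs from a Swagger2 (OpenAPI 2.0) spec, as was proven out against the FireCloud API. As a rough illustration of the idea only (not the actual CGA/DSDE tool, whose name and interface are not given in these notes), a minimal generator might walk the spec's paths and emit one Python function stub per operationId:

```python
# Illustrative sketch of Swagger2-driven code generation (hypothetical;
# not the actual tool discussed above). Walks a Swagger 2.0 spec dict
# and emits one Python function stub per operationId.

def generate_stubs(spec):
    """Return Python source with one stub function per Swagger2 operation."""
    lines = []
    for path, ops in spec.get("paths", {}).items():
        for verb, op in ops.items():
            # Fall back to a verb_path-derived name if operationId is absent
            name = op.get("operationId", f"{verb}_{path}").replace("/", "_")
            params = [p["name"] for p in op.get("parameters", [])]
            lines.append(f"def {name}({', '.join(params)}):")
            lines.append(f'    """{verb.upper()} {path}"""')
            lines.append("    raise NotImplementedError")
            lines.append("")
    return "\n".join(lines)

# Tiny example spec, in the spirit of the FireCloud API (endpoint invented
# here for illustration):
example_spec = {
    "swagger": "2.0",
    "paths": {
        "/workspaces": {
            "get": {"operationId": "list_workspaces", "parameters": []}
        }
    }
}

print(generate_stubs(example_spec))
```

A full generator would additionally resolve parameter types, request bodies, and response schemas from the spec's `definitions`, which is presumably what "Full Swagger2 support" covers.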

July 2017: FYI on proteomics deliverables from the FireCloud CGA team

Per Chet Birger/D.R. Mani meeting:
  • FireCloud data workspaces
    • one (possibly two - see below) for each of the three CPTAC AWGs (breast, ovarian, colon)
    • The workspaces will contain, at a minimum, the end results (protein level quantification) produced by each AWG.
    • We may also include the raw MS files, and/or the standardized mzML files.  But all of the pipelines used for analyzing these files rely on Windows-based software, and so cannot be run on FireCloud.
    • We will include the TCGA genomic, clinical and biospecimen data as well - this will help researchers who want to conduct correlative analyses.  It will mean, however, that we'll want to create both open and controlled-access versions of these workspaces, as the BAM and VCF files are controlled access.
    • We may also include the outputs of the CDAP pipeline, which are published on the CPTAC data portal.  
    • We will aim to get these workspaces in place by the end of August
  • Workflows
    • Since all of the workflows that run on either the raw MS files or the mzML files (CDAP) include Windows-based tasks, they cannot be run on FireCloud.
    • Mani's and Mike's teams are developing workflows for correlative analysis; we agreed to touch base with them at the end of August to see how far along any of these pipelines are and whether they could be included in our deliverables.  If not, so be it ... I'm hoping that NCI will see the value of the data workspaces for the future development of workflows.

May 31, 2017 On-Site (Broad Institute, Cambridge MA)

  • Mike's slides: here

April 4, 2017 Face-to-Face (Bethesda, MD)

  • Mike's slides: here