Cal State San Bernardino >> [CNS] >> [Comp Sci Dept] >> [R J Botting] >> [New Bibliographic Items] >> newb0718
Mon Jul 18 08:01:44 PDT 2005

Contents


    Dyba05

    1. Tore Dyba
    2. An Empirical Investigation of the Key Factors for Success in Software Process Improvement
    3. IEEE Trans Software Engineering V31n5(May 2005)pp341-424
    4. =POLL 120 ORGANIZATIONS IMPROVEMENT SPI STATISTICS
    5. p421: "organizational issues are at least as important in SPI as technology, if not more so."
    6. Results suggest the following:
      1. Align improvement goals with business. Share knowledge between business and software experts.
      2. Management provides resources not leadership in SPI.
      3. Involve the developers/workers in creating the proposed changes and metrics.
      4. Focus on the measurement of success.
      5. Look for existing knowledge to exploit and/or explore for new knowledge.
      6. Encourage workers/developers to explore new ideas because this involves them in the process.

    GregoriadesSutcliffe05

    1. Andreas Gregoriades & Alexander Sutcliffe
    2. Scenario-Based Assessment of Nonfunctional Requirements
    3. IEEE Trans Software Engineering V31n5(May 2005)pp392-409
    4. =DEMO TOOL SRA BAYES BN MODEL SYSTEM PERFORMANCE QUALITIES NFR REQUIREMENTS TESTING
    5. Allows for human unreliability in the system.
    6. Bayesian network allows the back propagation of high failure rates to likely root causes.
    7. Compare with NASA operational profiles.
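
Point 6 above can be illustrated with Bayes' rule on a single failure node. This is a minimal sketch with invented priors and likelihoods, not the paper's Bayesian network; every name and number below is hypothetical.

```python
# Back-propagating an observed failure to its most likely root cause via
# Bayes' rule. Priors and likelihoods are invented for illustration.
priors = {"operator_error": 0.3, "sensor_fault": 0.1, "software_bug": 0.6}
p_failure_given = {"operator_error": 0.4, "sensor_fault": 0.9, "software_bug": 0.05}

# P(failure) by total probability, then P(cause | failure) for each cause.
evidence = sum(priors[c] * p_failure_given[c] for c in priors)
posterior = {c: priors[c] * p_failure_given[c] / evidence for c in priors}
likely_cause = max(posterior, key=posterior.get)
```

A full Bayesian network generalizes this single update to chains of such updates across many nodes.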

    MyrtveitStensrudShepperd05

    1. Ingun Myrtveit & Erik Stensrud & Martin Shepperd
    2. Reliability and Validity in Comparative Studies of Software Prediction Models
    3. IEEE Trans Software Engineering V31n5(May 2005)pp380-391
    4. =SIMULATION =SURVEY RESEARCH EFFORT ESTIMATION COST PREDICTION
    5. Shows that there are no consistent recommendations in the literature for estimating cost/effort.
    6. Simulated the typical research procedure for evaluating and/or comparing ways of predicting the effort needed to produce software.
    7. Shows that the particular measure of goodness of fit chosen determines the "best" model.
    8. Shows that the particular sample of data also determines the "best" model.
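
Points 7 and 8 can be seen with a toy example. The sketch below uses invented effort data and two hypothetical "models" to show two accuracy measures common in this literature, MMRE and RMSE, ranking them in opposite orders.

```python
# Two goodness-of-fit measures disagreeing about which prediction model is
# "best" on the same (invented) data.

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Squared Error."""
    n = len(actual)
    return (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5

actual  = [10, 10, 10, 100]   # hypothetical actual effort figures
model_a = [5, 5, 5, 100]      # accurate on the one large project
model_b = [10, 10, 10, 40]    # accurate on the three small ones

# MMRE prefers model B (0.15 vs 0.375); RMSE prefers model A (~4.33 vs 30).
```

The same reversal happens with real data sets, which is the paper's point: the chosen measure (and sample) picks the winner.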

    FlanaganFreundQadeer05

    1. Cormac Flanagan & Stephen N Freund & Shaz Qadeer
    2. Exploiting Purity for Atomicity
    3. IEEE Trans Software Engineering V31n4(Apr 2005)pp275-291
    4. =THEORY NONSEQUENTIAL PURE ATOMIC REDUCTION

    OstrandWeyukerBell05

    1. Thomas J Ostrand & Elaine J Weyuker & Robert M Bell
    2. Predicting the location and number of faults in large software systems
    3. IEEE Trans Software Engineering V31n4(Apr 2005)pp340-355
    4. =EMPIRICAL SQA DEFECTS negative binomial
    5. like [OstrandWeyukerBell04]

    PutryczWoodsideWu05

    1. Erik Putrycz & Murray Woodside & Xiuping Wu
    2. Performance Techniques for COTS Systems
    3. IEEE Software Magazine V22n4(Jul/Aug 2005)pp36-43
    4. =ADVERT COMPONENTS PERFORMANCE ARCHITECTURE METHOD LOAD QUEUING MODEL
    5. Not unlike the 1980s Physical Design Control step in SSADM!

    JohnsonEtal05

    1. Philip M Johnson & Hongbin Kou & Michael Paulding & Qin Zhang & Aaron Kagawa & Takuya Yamashita
    2. Improving Software Development Management through Software Project Telemetry
    3. IEEE Software Magazine V22n4(Jul/Aug 2005)pp76-85
    4. =DEMO TOOL METRICS AUTOMATION DISPLAY Hackystat
    5. Hackystat::= See http://hackystat.ics.hawaii.edu/.

    Maiden05

    1. Neil Maiden
    2. What Has Requirements Research Ever Done for Us?
    3. IEEE Software Magazine V22n4(Jul/Aug 2005)pp104-105
    4. =SURVEY requirements KAOS i* Tropos Use-case maps LTSA CREWS ART-SCENE
    5. Sidebar has URLs.

    LemahieuEtal05

    1. Wilfried Lemahieu & Monique Snoeck & Frank Goethals & Manu De Backer & Raf Haesen & Jacques Vandenbulcke & Guido Dedene
    2. Coordinating COTS Applications via a Business Event Layer
    3. IEEE Software Magazine V22n4(Jul/Aug 2005)pp28-35
    4. =DEMO ARCHITECTURE BUSINESS PROCESS COTS COMPONENTS MODULES
    5. Proposes coordinating components by using two layers on top of the components. One layer describes a business process in terms of business events. The second layer defines the different business events and notifies the various subsets of application components in turn.
    6. Compare with MVC and Larman.
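
The two-layer idea in point 5 amounts to publish/subscribe on business events. A minimal sketch, with hypothetical class, event, and component names not taken from the paper:

```python
# Business event layer: components register interest in business events and
# are notified in turn, so COTS applications never call each other directly.

class BusinessEventLayer:
    def __init__(self):
        self._subscribers = {}  # event name -> list of component callbacks

    def subscribe(self, event, component):
        self._subscribers.setdefault(event, []).append(component)

    def publish(self, event, payload):
        for component in self._subscribers.get(event, []):
            component(event, payload)  # notify each interested component

notified = []
layer = BusinessEventLayer()
layer.subscribe("OrderPlaced", lambda e, p: notified.append(("billing", p)))
layer.subscribe("OrderPlaced", lambda e, p: notified.append(("inventory", p)))
layer.publish("OrderPlaced", {"order_id": 42})
```

Each COTS application only sees the event layer, which is what decouples them.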

    Denning05b

    1. Peter J Denning
    2. The Locality Principle
    3. Commun ACM V48n7(Jul 2005)pp19-24
    4. =SURVEY =HISTORY LOCALITY

    Jackson01

    1. Michael A Jackson
    2. Problem Frames: Analyzing and structuring software development problems
    3. Addison Wesley 2001 ISBN 0-201-59627-X QA76.76 D47 J32 2001
    4. =ESSAY PROBLEM ANALYSIS REQUIREMENTS REALITIES SYSTEMS METHODS
    5. An in-depth attempt at deconfusing a critical part of software development: understanding the problems and planning how to tackle them. Includes detailed references to several methods and some classic examples from the literature.
    6. Five basic problem frames: Required Behavior, Commanded Behavior, Information Display, Simple Workpieces, Transformation.
    7. Plus many variants and compositions.
    8. Good discussion of the concerns that arise with different types of problems and domains.
    9. Many examples. New glossary. Careful definitions of what the diagrams mean.
    10. Distinguishes: the machine, the parts of reality, the requirements, the existence of parts of reality that are symbolic descriptions, ...
    11. Some advice for decomposing a problem into subproblems -- for example, introducing a designed model of a part of reality and a machine to keep it synchronized, plus other machines to meet other requirements.

    ChenDiosMiliWuWang05

    1. Yaofei Chen & Rose Dios & Ali Mili & Lan Wu & Kefei Wang
    2. An Empirical Study of Programming Language Trends
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp72-79
    4. =POLL =HISTORY PROGRAMMING LANGUAGES
    5. Gathered data on what developers used, schools taught, and the primary language in companies. Dates: 1993, 1998, 2003
    6. Java is becoming the top language.
    7. Developers appear to value machine independence and extensibility most, and dislike simplicity and ease of implementation.

    KeilTiwana05

    1. Mark Keil & Amrit Tiwana
    2. Beyond Cost: The Drivers of COTS Application Value
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp64-69
    4. =POLL MARKET COMPONENTS PURPOSE QUALITIES COST CUSTOMIZATION USER
    5. MIS Managers tend to treat functionality and reliability as more important than cost, ease of customization, and ease of use.

    Glass05

    1. Robert L Glass
    2. IT Failure Rates -- 70% or 10..15%?
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp112+110-111
    4. =SURVEY Standish considered harmful
    5. Evidence that the 70% failure rate is mythical. Invites Standish to respond.

    Cukic05

    1. Bojan Cukic
    2. The virtues of assessing reliability early
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp51-53
    4. =ADVERT COMPONENTS QUALITIES RELIABILITY PREDICTION UML MODEL USE CASES SEQUENCE DEPLOYMENT TOOL beta-distribution Monte Carlo ECRA
    5. ECRA::= "extended component-based reliability assessment".
    6. See cukic@csee.wvu.edu.

    KarlstromRuneson05

    1. Daniel Karlstrom & Per Runeson
    2. Combining Agile Methods with Stage-Gate Project Management
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp43-49
    4. =EXPERIENCES SD SDLC+XP AGILE PROCESS MANAGEMENT ABB Ericsson Vodafone
    5. Engineers are ready for agile processes but managers fear them. Success needs both!
    6. XP leads to better quality.
    7. The SDLC documentation should be treated as another XP User Story.
    8. Agile process handles the day-to-day process of development and lets managers concentrate on the overall progress.

    SchatzAbdelshafi05

    1. Bob Schatz & Ibrahim Abdelshafi
    2. Primavera Gets Agile: A successful transition to agile Development
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp36-42
    4. =EXPERIENCE Scrum AGILE PROCESS =ADVERT TOOL
    5. Consequences: higher morale, customer-reported defects/KLOC dropped 30%, able to deliver one release 4 months early.
    6. Need for executive sponsor and external coach.
    7. Focus on developing teamwork.
    8. 40-hour week.
    9. Learn to negotiate and set expectations.
    10. Needed a process that turns a casual suggestion by a product owner into a new requirement.
    11. Then added test-driven development and re-emphasized code quality. Defect rate per item dropped 75%!

    Little05

    1. Todd Little
    2. Context-adaptive agility: managing complexity and uncertainty
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp28-35
    4. =PRACTICE ONE SIZE PROJECT -> PROCESS AGILE landmark
    5. Classify projects as Skunks, Dogs, Colts, Cows, and Bulls.
       Project     Simple     Complex
       Uncertain   Colt       Bull
       Certain     Skunk/Dog  Cow
    6. A Skunk is typically a prototype but a Dog is a mature project.
    7. Projects change state. 3 common trajectories. Skunk->Colt->Dog, Skunk->Colt-> Bull->Cow->Dog, Bull->Cow->Dog.
    8. Fit the process to the animal. Core practices + adaptions.
    9. Core practices: product plan, prioritized requirements, quality targets, continuous integration, involve expert users, project dashboard.
    10. Project_dashboard::= web-based project status updated weekly.
    11. Colts benefit from short iterations, daily stand up meetings, and automated testing.
    12. Cows need more rigorous management tools and procedures: CPM, subprojects, teams coordinated by Scrum, functional specs of interfaces,...
    13. Bulls need BOTH Colt and Cow practices. They need the best experienced managers.
    14. Has several process and project metrics.

    CeschiEtal05

    1. Martina Ceschi & Alberto Sillitti & Giancarlo Succi & Stefano De Panfilis
    2. Project Management in plan based and agile companies
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp21-27
    4. =POLL 20 MANAGERS PROCESS SATISFACTION AGILE
    5. All use some form of incremental delivery and face similar problems with deadlines and changing requirements.
    6. Some nonsignificant evidence that Agile companies tend to be more satisfied with project planning and customer relationships.
    7. 50% of both want developers who can work in a group.

    Thomas05

    1. Dave Thomas
    2. Agile Programming: Design to accommodate change
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp14-16
    4. =ADVERT TABULAR decision tables states FSM/STD Spreadsheets
    5. Convert changing code into data + table and let the users supply the data.
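
The code-into-data idea in point 5 can be sketched as a user-editable decision table. The shipping-cost rules and all numbers below are invented for illustration:

```python
# Table-driven design: volatile branching logic becomes a data table that
# users (not programmers) can edit; the interpreting code never changes.

SHIPPING_RULES = [
    # (max_weight_kg, region, cost) -- user-supplied data, not code
    (1,  "domestic", 5.00),
    (10, "domestic", 12.00),
    (1,  "overseas", 15.00),
    (10, "overseas", 40.00),
]

def shipping_cost(weight_kg, region):
    # Return the cost from the first matching rule.
    for max_weight, rule_region, cost in SHIPPING_RULES:
        if weight_kg <= max_weight and region == rule_region:
            return cost
    raise ValueError("no rule matches")
```

Changing the pricing policy now means editing `SHIPPING_RULES` (e.g. in a spreadsheet export), not redeploying code, which is the article's point.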

    Harrison05

    1. Warren Harrison
    2. Skinner Wasn't a Software Engineer
    3. IEEE Software Magazine V22n3(May/Jun 2005)pp5-7
    4. =EDITORIAL RESEARCH
    5. Notes problems with experiments (on students) and field studies.
    6. Describes single subject research designs.
    7. design::=baseline #(treatment withdraw).
    8. One subject implies no means to compare; look for an observable change instead (the therapeutic criterion).

    PotaninNobleFreanBiddle05

    1. Alex Potanin & James Noble & Marcus Frean & Robert Biddle
    2. Scale-Free Geometry in OO Programs
    3. Commun ACM V48n5(May 2005)pp99-103
    4. =STATISTICS Object-Oriented run-time heap data digraph
    5. Analyzed the heap of 9 large OO programs.
    6. Number of objects containing n pointers is roughly proportional to n ^ -3.
    7. Number of objects with n pointers pointing at them is roughly proportional to n ^ -2.5.
    8. Tendency for objects with many incoming pointers to have few outgoing pointers, and vice versa.
    9. Notes
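
The power laws in points 6 and 7 can be checked on a degree histogram: on a log-log plot, a power law n ^ k is a straight line of slope k. A sketch using synthetic data (not the paper's measurements):

```python
import math

# Synthetic histogram obeying count(n) = 9000 * n**-3 exactly.
hist = {n: 9000 * n ** -3 for n in range(1, 11)}

# Least-squares slope of log(count) against log(n) recovers the exponent.
xs = [math.log(n) for n in hist]
ys = [math.log(c) for c in hist.values()]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
# slope comes out as -3 (to floating-point precision)
```

On real heap data the fit is only approximate, which is why the paper reports "roughly proportional" exponents.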

    SommervilleRansom05

    1. Ian Sommerville & Jane Ransom
    2. An empirical study of industrial requirements engineering process assessment and improvement
    3. ACM TOSEM Trans Software Eng & Methodology V14n1(Jan 2005)pp85-117
    4. =EXPERIENCE 9 Greek IMPROVEMENT REQUIREMENTS ENGINEERING MATURITY TOOL IMPRESSION CMM CMMI REAIMS
    5. 3 levels: initial, repeatable, defined.
    6. 66 good practice guidelines: 36 basic, 21 intermediate, 9 advanced, in 8 areas of requirements engineering.
    7. Scoring: never, discretionary, normal, standardized.
    8. Others studied business performance measurement.
    9. Greek companies, with English methodologists, so companies did not use techniques invented by the experimenters.
    10. At first, none of the companies had a repeatable process, but some used intermediate and advanced practices anyway.
    11. Experimenters suggested guidelines to be targeted.
    12. All companies improved RE levels.
    13. Also looked at business results: critical success factors (CSF), key performance indicators (KPI). All companies showed improvements.
    14. Company reps said it was worth doing.
    15. Companies did not know what improvements to make.
    16. Classification of guidelines is domain dependent.
    17. The tool was worth developing: saved data entry time etc.

End