Cal State San Bernardino >> Comp Sci Dept >> R J Botting >> New Bibliographic Items >> newb0213
Mon Feb 14 16:34:06 PST 2005

Contents


    LiShawHerbslebRaySanthnanam04

    1. Paul Luo Li & Mary Shaw & Jim Herbsleb & Bonnie Ray & P Santhanam
    2. Empirical evaluation of Defect Projection models for widely-deployed production software systems
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp263-272
    4. =EMPIRICAL STATISTICS DEFECTS COTS Tomcat OpenBSD Weibull
    5. Gathered data about user reported defects in open and closed source code projects.
    6. Rate of defects in each release vs time in release.
    7. Found that Weibull works better than most models, with Gamma a rival.
    8. For time t, defect_rate(t)::= N* \alpha* \beta * t**(\alpha-1) * exp(-\beta * t**\alpha).
    9. Did not find a simple way to predict the parameters of each release.
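The Weibull rate in item 8 can be sketched directly. The parameter values below are made up purely for illustration; as item 9 notes, the paper found no simple way to predict them for a given release:

```python
import math

def defect_rate(t, N, alpha, beta):
    """Weibull defect-rate model from item 8:
    rate(t) = N * alpha * beta * t**(alpha-1) * exp(-beta * t**alpha)."""
    return N * alpha * beta * t ** (alpha - 1) * math.exp(-beta * t ** alpha)

# Illustrative parameters only (N = total defects, alpha = shape,
# beta = scale) -- not values from the paper.
for week in (1, 5, 10, 20):
    print(week, round(defect_rate(week, N=100.0, alpha=1.5, beta=0.05), 2))
```

With a shape parameter above 1 the rate rises after release, peaks, and then decays, which is the qualitative behavior the paper observed in release defect data.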

    SouzaRedmillesChengMillenPatterson04

    1. Cleidson R B de Souza & David Redmiles & Li-Te Cheng & David Millen & John Patterson
    2. How a good software practice thwarts collaboration -- the multiple roles of APIs in Software Development
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp221-229
    4. =OBSERVATION PRACTICE TEAM MODULES INTERFACE Java =IDEA SOCIAL Call-graph
    5. Interfaces (public and published classes, interfaces, and methods) were an important part of development in this project, acting as a contract, supporting independent work, following organizational boundaries, and forming a language for describing dependencies.
    6. However: APIs were unstable, incompletely implemented, and stopped important information from being shared.
    7. Tools were not used to track and share changes in APIs.
    8. Proposal: use a tool to analyze code and track (1) call-graph dependencies across APIs plus (2) who is working on which parts. Hence track who depends on whom and advertise the need to communicate.
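The core of the proposal in item 8 is a join between call-graph edges that cross an API boundary and code ownership. A minimal sketch, with an entirely made-up call graph and owner map (module-per-owner is an assumption, not the paper's model):

```python
# Hypothetical inputs: call-graph edges as (caller, callee) function
# names, and an owner for each top-level module.
calls = [("billing.charge", "payments.api.pay"),
         ("billing.refund", "payments.api.cancel"),
         ("reports.summary", "billing.charge")]
owners = {"billing": "alice", "payments": "bob", "reports": "carol"}

def module(fn):
    # Take the top-level module of a dotted function name.
    return fn.split(".")[0]

def who_depends_on_whom(calls, owners):
    deps = set()
    for caller, callee in calls:
        a, b = module(caller), module(callee)
        if a != b:  # only edges that cross a module/API boundary matter
            deps.add((owners[a], owners[b]))
    return sorted(deps)

print(who_depends_on_whom(calls, owners))
# -> [('alice', 'bob'), ('carol', 'alice')]
```

Each resulting pair is a developer who should hear about API changes made by the other, which is exactly the "advertise the need to communicate" step.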

    EdwardsJacksonTorlak04

    1. Jonathan Edwards & Daniel Jackson & Emina Torlak
    2. A Type System for Object Models
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp189-199
    4. =IDEA LOGIC RELATIONAL TYPES Alloy 2.0
    5. Defines an improved type system for Alloy with subtypes, relations, etc.
    6. Compares with UML OCL.
    7. Precisely defines two types for formulas: bounding type and relevance type.
    8. An empty relevance type indicates an error.

    LevesonWeiss04

    1. Nancy G Leveson & Kathryn Anne Weiss
    2. Making embedded software reuse practical and safe
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp171-
    4. =EXAMPLES RISKS REUSE QUALITIES SAFETY EVOLUTION INTENT DOCUMENTATION DESIGN DECISIONS NASA MCO Ariane SOHO TOOL SpecTRM And/Or TABULAR SPHERES PAD FSA TCAS
    5. Accidents were caused when undocumented assumptions made by reused components ceased to be true.
    6. Change happens and documentation helps to trace the changes to the components that are no longer safe to (re)use.
    7. Generic libraries of intent specification for SPHERES project with a single experimental reuse...
    8. Need to document the WHY at half-a-dozen levels.
    9. Claims that OO requirements analysis cannot produce safely reusable components because requirements are distributed across many objects.
    10. However, OO design of components proven to meet documented requirements may be safe to reuse.

    MaziniOsterman04

    1. Mira Mezini & Klaus Ostermann
    2. Variability Management with Feature-Oriented Programming and Aspects
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp127-136
    4. =ADVERT Caesar TECHNICAL VARIABILITY FEATURES vs ASPECTS AspectJ FOA
    5. Feature-oriented programming allows refinements to existing base classes to be coded in layers, which may be mixed. This is implicitly hierarchical, and the hierarchy may not fit other structures. Layers do not handle cross-cutting changes that impact many methods.
    6. Aspects handle cross-cutting concerns (by pointcut + advice) but cannot form hierarchies.
    7. Caesar::language=combines layers and aspects....
    8. Will be applied in [ http://www.topprax.de/ ] the TOPPrax project.
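The layering in item 5 has a rough Python analogue (this is an illustration of feature-oriented layering in general, not of Caesar itself): each feature is a mixin that refines the layer below it, and a product is assembled by stacking layers in the method resolution order.

```python
class Base:
    def describe(self):
        return "stack"

class UndoFeature:
    def describe(self):
        # Refines whatever layer sits below it in the MRO.
        return super().describe() + "+undo"

class LoggingFeature:
    def describe(self):
        return super().describe() + "+logging"

class Product(LoggingFeature, UndoFeature, Base):
    """A product built by mixing two feature layers over the base."""

print(Product().describe())  # -> stack+undo+logging
```

The fixed stacking order is the implicit hierarchy the item mentions; a change that must touch many unrelated methods at once still needs an aspect-style pointcut + advice mechanism, which is the gap Caesar aims to close.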

    Capra04

    1. Licia Capra
    2. Engineering Human Trust in Mobile System Collaborations
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp107-116
    4. =THEORY MATHEMATICS TRUST AGENTS hTrust TMF
    5. TMF::="Trust Management Framework".
    6. Models the formation, dissemination and evolution of trust between agents.
    7. trust_data::=Net{who_trusts, opinion, trusted, level, subject, direct_experiences, credentials, recommendations }.
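The trust_data tuple in item 7 can be rendered as a record type. The field types below are guesses for illustration, not taken from the hTrust paper:

```python
from dataclasses import dataclass, field

@dataclass
class TrustData:
    """Sketch of trust_data from item 7; types are assumptions."""
    who_trusts: str
    opinion: str
    trusted: str
    level: float            # e.g. a degree of trust in [0, 1]
    subject: str            # what the trust is about
    direct_experiences: list = field(default_factory=list)
    credentials: list = field(default_factory=list)
    recommendations: list = field(default_factory=list)

t = TrustData("alice", "reliable", "bob", 0.8, "file_sharing")
print(t.who_trusts, t.trusted, t.level)
```

Formation, dissemination, and evolution of trust then become operations that create, exchange, and update such records between agents.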

    ZitserLippmannLeek04

    1. Misha Zitser & Richard Lippmann & Tim Leek
    2. Testing static analysis tools using exploitable buffer overflows from open source code
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp97-106
    4. =EXPERIMENT open source SECURITY TOOLS Boon Archer Uno Splint PolySpace C sendmail BIND WU-FTPD
    5. Static analysis tools miss known buffer overflows and misdiagnose safe programs.
    6. PolySpace best.
    7. Main problem: the content of arrays determines whether code is safe or unsafe.

    LetierLamsweerde04

    1. Emmanuel Letier & Axel van Lamsweerde
    2. Reasoning about partial goal satisfaction for requirements engineering.
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp53-62
    4. =DEMO QUALITIES PROBABILITY GOALS PCTL

    DingelLiang04

    1. Juergen Dingel & Hongzhi Liang
    2. Automating comprehensive safety analysis of concurrent programs using VeriSoft and TXL
    3. Proc SIGSOFT'04/FSE-12& ACM SIGSOFT Software Engineering Notes V29n6(Nov 2004)pp13-22
    4. =DEMO SQA AUTOMATED MODEL CHECKING TOOL ViP TXL plLTL Verisoft

    Wu04

    1. Fangjun Wu
    2. Empirical Analysis of Entropy distance Metric for UML Class Diagrams
    3. ACM SIGSOFT Software Engineering Notes V29n5(Sep 2004)p35 [ 1022494.1022524 ]
    4. =EMPIRICAL UML METRIC Zhou banking understanding
    5. Abstract: "[...]we provide empirical evidence for supporting the role of the structure complexity metrics for UML class diagrams, specifically Zhou's metric. Our results, based on data related with bank information system, indicate that the metric is basically consistent with human beings' intuitions. "

    Holzinger04

    1. Andreas Holzinger
    2. Usability Engineering methods for Software developers
    3. Commun ACM V48n1(Jan 2005)pp71-74
    4. =REFERENCE USER QUALITIES MEASURE TEST HCI SQA INSPECTION
    5. usability::=(learnability, efficiency, memorability, low error rate, satisfaction).
    6. usability_inspection_methods::= heuristic_evaluation + cognitive_walkthrough + action_analysis.
    7. action_analysis =~= time_and_motion_analysis.
    8. usability_test_methods::= thinking_aloud + field_observation + questionnaires.
    9. matrix on p 72.

    Turchin03

    1. Peter Turchin
    2. Historical Dynamics: Why States rise and fall
    3. Princeton UP (complexity) 2003? D16.25 T87 ISBN 0-691-11669-5 $35
    4. =MATHEMATICS HISTORY

    HirschheimNewman91

    1. Rudy Hirschheim & Mike Newman
    2. Symbolism and Information System Development: Myth, Metaphor, and Magic
    3. Information Systems Research V2n1(??? 1991) & [MyersAvisson02]
    4. =SURVEY POSTMODERN PEOPLE SYSTEMS ANTHROPOLOGY CYBERCRUD
    5. Ref to [Markus83]
    6. Argues that information system development cannot be described as a rational endeavor and gives examples from 4 projects of myths, metaphors, and magic (ritual).
    7. Distinguishes believers (faith overrides events) from cynics (belief follows events).
    8. 6 myths: user involvement, resistance always happens and must be overcome, integration is good, systems people know best, politics doesn't matter, top-down is good.
    9. 3 metaphors: it's a battle, the organization is divided into fiefdoms, people are machines.
    10. 3 magic rituals: involving the user, signing off, owning data.

End