All posts by benjdavies

Benjamin Davies is a researcher at the University of Auckland. His research applies spatial simulation and agent-based modeling methods to questions of human mobility, demography, and archaeological site formation. He works primarily in Australia and the Pacific Islands. Follow him on Twitter at @ba_davies.

CAA 2016 Session Videos

Continuing on the video theme: a while back we encouraged folks to attend this year’s Computer Applications in Archaeology conference in Oslo. It was a blast to attend, and Oslo is a really cool city to spend a week in. I even briefly considered staying on to start a career doing car advertisements.


However, if you weren’t able to make it up to Oslo, Doug Rocks-Macqueen, author of the excellent blog Doug’s Archaeology, has you covered: his session recordings have been making their way out on to the interwebs via his YouTube channel, Recording Archaeology. Now you can relive all of the action of CAA Oslo right in your own home!

Here are a few of the sessions, helpfully organized as playlists of individual talks:

Linked pasts: Connecting islands of content

Methodology of archaeological simulation. Meeting of the Special Interest Group in Complex Systems Simulation

The road not taken: Modelling approaches to transport on local and regional scales

Can you model that? Applications of complex systems simulation to explore the past

Networking the past: Towards best practice in archaeological network science

Theorising the Digital: Digital Theoretical Archaeology Group (digiTAG) and the CAA

Interpretations from digital sensations? Using the digital sensory turn to discover new things about the past

For more videos, check out Recording Archaeology. And don’t forget to register for CAA 2017 in Atlanta!

 

CfP: Computer Applications in Archaeology, March 14 – 17, Atlanta, GA USA

The folks at CAA have recently announced a call for papers for the 2017 conference, to be held at Georgia State University in Atlanta. From the conference website:

The 45th CAA conference will bring together scholars from across the globe to share their cutting edge research from a diverse range of fields in a focused, but informal, setting.  One thing that the CAA prides itself on is a strong sense of community, and we hope to continue to grow that community by welcoming new participants this year.  This is only the 3rd time the conference has been held in the United States, and we are excited to have old and new members join us in Atlanta this coming spring.

There are a TON of sessions to choose from this year, showcasing the diversity of computational approaches in archaeology as well as interest in theory and ways of knowing. The full list of sessions is here.

The authors of this blog will be co-chairing a few different sessions at the conference, including:

 

Data, Theory, Methods, and Models. Approaching Anthropology and Archaeology through Computational Modeling

Quantitative model-based approaches to archaeology have been rapidly gaining popularity. Their utility in providing an experimental test-bed for examining how individual actions and decisions could influence the emergence of complex social and socio-environmental systems has fueled a spectacular increase in adoption of computational modeling techniques to traditional archaeological studies. However, computational models are restricted by the limitations of the technique used, and are not a “silver bullet” solution for understanding the archaeological and anthropological record. Rather, simulation and other types of formal modeling methods provide a way to interdigitate between archaeology/anthropology and computational approaches and between the data and theory, with each providing a feedback to the other. In this session we seek well-developed models that use data and theory from the anthropological and archaeological records to demonstrate the utility of computational modeling for understanding various aspects of human behavior. Equally, we invite case studies showcasing innovative new approaches to archaeological models and new techniques expanding the use of computational modeling techniques.

Everything wrong with…

This is a different kind of session. Instead of the usual celebration of our successes, this session will look at our challenges. But it will not degenerate into self-pity and negativity: it will be about critical reflection and possible solutions. The goal of this session is to raise the issues we should be tackling, to break the mold of the typical conference session in which we review what we have solved, and instead to explore what needs to be solved. Each participant will give a short presentation (max 10 minutes, but preference will be for 5) in which they take one topic and critically analyse the problems surrounding it, both new and old. Ideally, at the end each participant will have laid out a map of the challenges facing their topic. The floor will then be opened up to the audience to add more issues, refute the problems raised, or propose solutions. This is open to any topic: GIS, 3D modelling, public engagement, databases, linked data, simulations, networks, etc. It can be about a very narrow topic or broad ranging, e.g. everything wrong with C14 dating, everything wrong with least cost path analysis in ArcGIS, everything wrong with post-processualism, etc. However, this is an evaluation of our methods and theories, and is not meant to be as high-level as past CAA sessions that have looked at grand challenges, e.g. the beginnings of agriculture. Anyone interested in presenting is asked to submit a topic (1-2 sentences) and an estimated time to summarize it (5 or 10 minutes). Full abstracts are not necessary.

The ups and downs of archaeological simulation

The continuing rise of computational modelling applications, in particular simulation approaches, resembles the ‘hype’ cycles our discipline experienced in the past. The introduction of statistics, data management or GIS all started with inflated expectations and an explosion in applications, followed by a ‘correction’ phase in which early optimism dwindled amid heavy critique of exaggerated claims and examples of misapplication. The next phase, ‘maturity’, is reached when the use of a particular technique is no longer questioned (although particular applications of it may still be) as it becomes part of the standard research toolkit. The jury is still out on whether the use of simulation techniques in archaeology is reaching the peak of the ‘optimism’ phase or is perhaps still in the midst of the ‘correction’ phase. However, lessons learned from other, now commonly used, computational methods, or from other disciplines, could accelerate the process of establishing simulation in the mainstream of archaeological practice. The Special Interest Group in Complex System Simulation would like to open the discussion to a wide audience of archaeologists and therefore invites all CAA2017 participants to take an active part in the roundtable. During the meeting we will consider the current place of simulation in archaeological practice, the main challenges facing modellers, and the road map for the future.

The conference promoters are also looking for folks interested in putting together workshops for the day before the conference. The deadline for abstract submissions is midnight on Friday, October 28th. For more information, visit the CAA conference website.

Featured image: Midtown HDR Atlanta by Mmann1988 (Wikimedia Commons CC BY 2.0)

 

Simulados: a short video explaining what ABM is and how we use it to understand the past

This video, brought to you by our friends over at the Barcelona Supercomputing Center, does a great job of explaining in easy-to-understand terms what agent-based modeling is, and how it can be useful for both understanding the past and making the past relevant to the present. No small feat to accomplish in about 3 minutes. Have a look!

CFP: Computational Social Science Society of the Americas, Santa Fe, Nov 17-20

The CSSSA will be hosting its annual conference in November, bringing researchers from all stripes of computational social science together in beautiful Santa Fe, New Mexico. According to the website, some of the topics to be discussed at the meeting include (but are not limited to):

  • Social network analysis
  • Agent-based models / modeling
  • Emergence
  • Economic models / resource allocation
  • Population dynamics
  • Ecosystems
  • Political/social systems
  • Biological systems / metabolism / bioenergetics
  • Efficiencies / fitness functions
  • Competition / cooperation
  • Networks / information flow
  • Social contagion
  • Vision / knowledge acquisition
  • Influence
  • Swarm intelligence
  • Adaptation / evolution
  • Decision making
  • Local knowledge / global patterns
  • Game theoretic models
  • Strategy
  • Learning

Applications close August 15th, 2016. For more information, check out the CSSSA website.

Image source: Wikimedia Commons/U.S. Public Domain

CFP: SwarmFest 2016, Burlington, VT Jul 31 – Aug 3

Possibly the longest-running meeting on agent-based modeling, SwarmFest is being held this July at the University of Vermont campus in Burlington. Now in its 20th(!) year, SwarmFest brings together people from a range of backgrounds in ABM and simulation. From the website:

SwarmFest is the annual meeting of the Swarm Development Group (SDG), and one of the oldest communities involved in the development and propagation of agent-based modeling.  SwarmFest has traditionally involved a mix of both tool-users and tool-developers, drawn from many domains of expertise.  These have included, in the past, computer scientists, software engineers, biomedical researchers, ecologists, economists, political scientists, social scientists, resource management specialists and evolutionary biologists.  SwarmFest represents a low-key environment for researchers to explore new ideas and approaches, and benefit from a multi-disciplinary environment.  

Given the concentration of computational and complexity labs at UVM, this promises to be a very exciting meeting. And summertime is a fantastic time to be on Lake Champlain, or really any lake in New England, so I wholeheartedly recommend the trek to Burlington.

The call for abstracts closes June 15th, so get in quickly. For more info, see the website.

CFP: Interactive Pasts conference, Leiden April 4-5 2016

People play video games, archaeologists included. People are spending more and more time in the virtual worlds presented by video games, raising the question of how our digital past is to be studied or curated. And video games are often constructed within historical frames, whether characters are fighting dysentery on the Oregon Trail or fighting mutants in a post-apocalyptic Boston. Video games offer a window into historical process and narrative-building that more passive media cannot.

There is a growing contingent of archaeologists and historians who are using and exploring video games both as media for portraying the past (or pasts) and as a valuable source of information on the digital lives of humans in the more recent past. Greater historical detail in games also suggests a role for archaeologists in the development of games.

Enter Interactive Pasts: a conference bringing together these disparate interests. From the website:

This ARCHON-GSA conference will explore the intersections of archaeology and video games. Its aim is to bring scholars and students from archaeology, history, heritage and museum studies together with game developers and designers. The program will allow for both in-depth treatment of the topic in the form of presentations, open discussion, as well as skill transference and the establishment of new ties between academia and the creative industry.

If you’re already going to be on the road for the CAA conference in Oslo, this conference conveniently begins right afterwards in Leiden. Abstracts are due on the 31st, and more information can be found here.

Building a Schelling Segregation Model in R

Happy New Year! Last year, our good friend Shawn over at Electric Archaeology introduced us to an excellent animated, interactive representation of Thomas Schelling’s (1969) model of segregation called “Parable of the Polygons”. I’ve always liked Schelling’s model because I think it illustrates the concepts of self-organization and emergence, and is also easy to explain, which makes it a useful example of a complex system. In the model, individuals situated in a gridded space decide whether to stay put or move based on a preference for neighbors like them. The model demonstrates how features of segregated neighborhoods can emerge even when groups are relatively ‘tolerant’ in their preferences for neighbors.
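The decision rule at the heart of the model can be sketched as a tiny function (a hypothetical helper for illustration only; the full model below inlines this logic rather than calling a function):

```r
# An agent is happy when the share of like neighbors meets its preference;
# agents with no inhabited neighbors are treated as happy.
is_happy <- function(like, total, preference) {
  if (total == 0) return(TRUE)
  (like / total) >= preference
}

is_happy(2, 8, 0.6)  # FALSE: only 25% of this agent's neighbors are alike
is_happy(5, 8, 0.6)  # TRUE: 62.5% alike meets the 60% preference
```

Unhappy agents move and happy agents stay put; the large-scale segregation patterns emerge from these local decisions alone.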

Here, I’ve created a simple version of Schelling’s model using R (building on Marco Smolla’s excellent work on creating agent-based models in R). Schelling’s model is situated on a grid, and in its simplest form, the cells of the grid will be in one of three states: uninhabited, inhabited by a member of one group, or inhabited by a member of a second group. This could be represented as a matrix of numbers, with each element being either 0, 1, or 2. So we’ll bring together these components as follows:

# model parameters: 2000 agents on a 51 x 51 (2601-cell) grid, split evenly into two groups
number<-2000
group<-c(rep(0,(51*51)-number),rep(1,number/2),rep(2,number/2))
grid<-matrix(sample(group,2601,replace=F),ncol=51)

# plot the starting grid next to an empty axis that will track happiness over time
par(mfrow=c(1,2))
image(grid,col=c("black","red","green"),axes=F)
plot(runif(100,0,1),ylab="percent happy",xlab="time",col="white",ylim=c(0,1))

Here, we start with a 51 x 51 grid in which 2000 cells are occupied. To create this, a vector called group is generated that contains 1000 1s, 1000 2s, and 601 0s for the remaining cells. These are collated into a matrix called grid through random sampling of the group vector. Finally, the matrix is plotted as an image where occupied cells are colored green or red depending on their group, while unoccupied cells are colored black, like so:

[Figure: the initial randomized grid, alongside the empty happiness-tracker plot]
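As a quick sanity check on the setup (repeating the grid-construction lines from above so this snippet runs on its own), we can count the cell states:

```r
# rebuild the initial grid as above
number<-2000
group<-c(rep(0,(51*51)-number),rep(1,number/2),rep(2,number/2))
grid<-matrix(sample(group,2601,replace=F),ncol=51)

# 601 empty cells (0), 1000 cells in group 1, 1000 cells in group 2
table(grid)
```

Because sample only shuffles the group vector, the counts are the same on every run; only the positions change.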
The next step is to establish the common preference for like neighbors, and to set up a variable that tracks the overall happiness of the population.

alike_preference<-0.60
happiness_tracker<-c()

Finally, we’ll need a function, which we’ll call get_neighbors, to establish who the neighbors are for a given patch. To do this, we’ll feed the function a pair of xy-coordinates as a vector (e.g. c(2,13)), and use a for loop to pull each neighbor in the Moore neighborhood (the 8 surrounding patches) in order, counterclockwise from the right. We’ll also need to ensure that if a neighboring cell goes beyond the bounds of the grid (>51 or <1), we account for this by grabbing the cell at the opposite end of the grid. This function will return eight pairs of coordinates as a matrix.

get_neighbors<-function(coords) {
  # Moore neighborhood offsets, counterclockwise from the right-hand neighbor
  x_offsets<-c(1,1,0,-1,-1,-1,0,1)
  y_offsets<-c(0,1,1,1,0,-1,-1,-1)
  n<-c()
  for (i in c(1:8)) {
    x<-coords[1] + x_offsets[i]
    y<-coords[2] + y_offsets[i]
    # wrap around the edges of the grid (a torus)
    if (x < 1) {x<-51}
    if (x > 51) {x<-1}
    if (y < 1) {y<-51}
    if (y > 51) {y<-1}
    n<-rbind(n,c(x,y))
  }
  n
}
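To see the wrap-around in action, we can call the function on a corner cell (the definition is repeated here in compact form, with the same logic and neighbor order as above, so this snippet runs on its own):

```r
# get_neighbors as defined above, repeated so this snippet is self-contained
get_neighbors<-function(coords) {
  x_offsets<-c(1,1,0,-1,-1,-1,0,1)
  y_offsets<-c(0,1,1,1,0,-1,-1,-1)
  n<-c()
  for (i in c(1:8)) {
    x<-coords[1] + x_offsets[i]
    y<-coords[2] + y_offsets[i]
    if (x < 1) {x<-51}
    if (x > 51) {x<-1}
    if (y < 1) {y<-51}
    if (y > 51) {y<-1}
    n<-rbind(n,c(x,y))
  }
  n
}

# neighbors of the top-right corner wrap to the opposite edges of the grid
n<-get_neighbors(c(51,51))
nrow(n)    # 8 neighbors, as an 8 x 2 matrix of coordinates
range(n)   # all coordinates stay between 1 and 51
```

The first row is c(1,51): the cell to the "right" of column 51 wraps around to column 1.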

Now to get into the main program. We’ll run the process 1000 times to get output, so the whole thing will be embedded in a for loop. Then we’ll set up some variables to keep track of happy versus unhappy cells:

for (t in c(1:1000)) {
happy_cells<-c()
unhappy_cells<-c()  

Each of these trackers (happy_cells and unhappy_cells) will accumulate the coordinates of cells that are happy or unhappy, one row per cell.

Next, we’ll use two for loops to iterate through each row and column in the matrix. For each cell (here called current), we’ll take its value (0, 1, or 2). If the cell is not empty (that is, its value is not 0), we’ll create variables that track the number of like neighbors and the total number of inhabited neighbors, and use our get_neighbors function to generate a matrix of coordinates called neighbors. Then we’ll loop through each of those neighbors and compare their values to the value of the current patch. If a neighbor’s value matches, we add 1 to the number of like neighbors; if it is inhabited at all, we add 1 to the total number of neighbors (a variable called all_neighbors). Finally, we divide the number of like neighbors by the total number of neighbors and compare that fraction to the like-neighbor preference to determine whether the current patch is happy or not (the is.nan function is used here to catch cells that are completely isolated, which would otherwise involve division by zero; these are treated as happy). Happy patches are added to happy_cells and unhappy patches to unhappy_cells, both as matrices of coordinates.

for (j in c(1:51)) {
  for (k in c(1:51)) {
    current<-c(j,k)
    value<-grid[j,k]
    if (value > 0) {
      like_neighbors<-0
      all_neighbors<-0
      neighbors<-get_neighbors(current)
      for (i in c(1:nrow(neighbors))) {
        x<-neighbors[i,1]
        y<-neighbors[i,2]
        if (grid[x,y] > 0) {
          all_neighbors<-all_neighbors + 1
        }
        if (grid[x,y] == value) {
          like_neighbors<-like_neighbors + 1
        }
      }
      if (is.nan(like_neighbors / all_neighbors)==FALSE) {
        if ((like_neighbors / all_neighbors) < alike_preference) {
          unhappy_cells<-rbind(unhappy_cells,c(current[1],current[2]))
        } else {
          happy_cells<-rbind(happy_cells,c(current[1],current[2]))
        }
      } else {
        # isolated cells (no inhabited neighbors) are treated as happy
        happy_cells<-rbind(happy_cells,c(current[1],current[2]))
      }
    }
  }
}

Next, we’ll get our overall happiness by dividing the number of happy cells by the total number of occupied cells, and update our happiness tracker by appending that value to the end of the vector.

# note: length() counts matrix elements here (two per cell), but the factor of two cancels in the ratio
happiness_tracker<-append(happiness_tracker,length(happy_cells)/(length(happy_cells) + length(unhappy_cells)))

Next, we’ll get our unhappy patches to move to unoccupied spaces. To do this, we’ll randomly sample unhappy cells so we’re not introducing a spatial bias. Then, we’ll iterate through that sample, calling each patch in that group a mover, and picking a random spot in the grid as a moveto. A while loop will continue to pick a new random moveto while the current moveto is inhabited. Once an uninhabited moveto has been found, the mover’s value is applied to that patch, and removed from the original mover patch.

# if every cell is happy, there is nothing to move this round
if (!is.null(unhappy_cells)) {
  rand<-sample(nrow(unhappy_cells))
  for (i in rand) {
    mover<-unhappy_cells[i,]
    mover_val<-grid[mover[1],mover[2]]
    move_to<-c(sample(1:51,1),sample(1:51,1))
    move_to_val<-grid[move_to[1],move_to[2]]
    while (move_to_val > 0) {
      move_to<-c(sample(1:51,1),sample(1:51,1))
      move_to_val<-grid[move_to[1],move_to[2]]
    }
    grid[mover[1],mover[2]]<-0
    grid[move_to[1],move_to[2]]<-mover_val
  }
}

Finally, we’ll check the output.

par(mfrow=c(1,2))
image(grid,col=c("black","red","green"),axes=F)
plot(runif(100,0,1),ylab="percent happy",xlab="time",col="white",ylim=c(0,1))
lines(happiness_tracker,col="red")
}

With the for loop we created around the whole program, we get animation in our graphical display. Here’s what we get when we set the alike_preference value to 70 percent:

[animation: alike_preference = 0.70]

And here’s what happens when it’s set to 72 percent:

[animation: alike_preference = 0.72]

Finally, here’s what happens when it’s set to 80:

[animation: alike_preference = 0.80]

For comparison, check out this version in NetLogo or this HTML version.

Call for Papers: Computer Applications in Archaeology, Oslo, March 29 – April 2 2016

The folks at CAA have issued a call for papers for next year’s conference in Oslo. The conference theme is Exploring Oceans of Data, befitting the maritime heritage of the host city. There are a number of exciting sessions planned, including one organised by us, your friendly neighborhood SimulatingComplexiteers:

Can You Model That? Applications of Complex Systems Simulation to Explore the Past

The large scale patterns that we commonly detect in the archaeological record are often not a simple sum of individual human interactions. Instead, they are a complex interwoven network of dependencies among individuals, groups, and the environment in which individuals live. Tools such as Agent-based Modelling, System Dynamics Models, Network Analysis and Equation-based Models are instrumental in unravelling some of this network and shedding light on the dynamic processes that occurred in the past. In this session we invite case studies using computational approaches to understand past societies. This session will showcase the innovative ways archaeologists have used simulation and other model building techniques to understand the interactions between individuals and their social and natural environments. The session will also provide a platform to discuss both the potential and the limitations of computational modelling in archaeology and to highlight the range of possible applications.

There are also a number of other amazing looking sessions. Here’s just a few:

  • Networking the past: Towards best practice in archaeological network science
  • Using GIS Modeling to Solve Real-World Archaeological Problems
  • Exploring Maritime Spaces with Digital Archaeology: Modelling navigation, seascapes, and coastal spaces
  • Analyzing Social Media & Online Culture in Archaeology
  • Modelling approaches to analyse the socio-economic context in archaeology II: defining the limits of production
  • Computational approaches to ancient urbanism: documentation, analysis and interpretation

Personally, I can’t think of a better way to spend a few days than talking computers and archaeology in lovely Oslo. For more information or to submit an abstract, visit the CAA conference website.