[Buildbot-devel] Builder-Results
Vasily
vasslitvinov at pisem.net
Wed Aug 6 19:53:29 UTC 2014
Hi there,
Sorry for the delay.
Attached are some files that should work (though keep in mind that they
haven't been tested in this form, since I had to edit them to remove non-general stuff).
The main idea: if you're running things locally, you import your master
configuration (usually "master.cfg") somehow, then import bb_data_provider
and instantiate BuildbotDataProvider with bbConfig set to the imported
master.cfg.
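Locally that would look roughly like this (just a sketch; how exactly you load
master.cfg, and whether it exposes the "c" dict and "basedir" the way
bb_local_connector expects, depends on your setup, and the path and builder
name below are placeholders):

    import imp
    from bb_data_provider import BuildbotDataProvider

    masterCfg = imp.load_source('master_cfg', '/path/to/master/master.cfg')
    provider = BuildbotDataProvider(bbConfig=masterCfg)
    builds = provider.getBuildsByBuilderName(builder='%nightly%')
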
If you want to run things remotely, you set up a JSON-RPC service
(implementation not included here) that runs locally using
BuildbotDataProvider, and connect to it using bb_jsonrpc_connector.py.
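On the client side that would be something like this (again just a sketch; the
service URL and credentials are placeholders, and the JSON-RPC server itself is
not part of the attachments):

    from bb_jsonrpc_connector import createConnector
    from bb_data_provider import BuildbotDataProvider

    connector = createConnector('http://buildbot-host:8080/jsonrpc', 'user', 'secret')
    provider = BuildbotDataProvider(dataConnector=connector)
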
Thanks,
Vasily
2014-07-10 0:49 GMT+04:00 Nachaat Hassis <nachaat05 at yahoo.fr>:
> Hello,
> Yes, I'm interested.
> It would be very helpful for me.
>
> Greets,
> Nach
>
> On 09.07.2014, at 22:46, Vasily <vasslitvinov at pisem.net> wrote:
>
> I think I can share some code that we built for 0.8.0 (and updated for
> 0.8.7p1) that extracts some information about builds from the DB and pickle files.
>
> Is anyone interested?
>
> Thanks,
> Vasily
>
>
> 2014-07-09 20:30 GMT+04:00 Dustin J. Mitchell <dustin at v.igoro.us>:
>
>> The next version of Buildbot keeps all of its data in a database,
>> accessible via a well-documented REST API. So you could build a tool
>> to extract the desired data that way.
>>
>> The latest release still keeps a great deal of data in binary pickle
>> files, which are not easy to extract data from.
>>
>> Dustin
>>
>> On Fri, Jul 4, 2014 at 2:56 AM, Toph Bei Fong <toph_ut at yahoo.de> wrote:
>> > Hello,
>> > does anyone know if there is a way to save the results of the builds in a
>> > database, so I can get them later and make a report, for example? I know I
>> > can do a report in Buildbot, but I would like to be more flexible.
>> > And the second question would be: if there is no way to save the results in
>> > a database, can I "fake" a result, for example?
>> > I mean, I have tests which take 6 hours (a builder with many scheduled
>> > builders). Let's say one of the builders returned a sporadic error; I
>> > wouldn't like to redo the 6-hour tests just to have a report with all the
>> > steps in green.
>> > In this case, I would like to just "redirect" or "fake" the builder with an
>> > error to a builder without the error, which I started later.
>> >
>> > greets,
>> > Toph.
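For the data API Dustin mentions, pulling recent builds from a newer
(0.9-style) master would look roughly like this; the URL, port and endpoint
below are assumptions based on Buildbot's documented REST API and are not used
by the attached code:

    import json
    import urllib2

    base = 'http://localhost:8010/api/v2'   # placeholder master URL
    data = json.loads(urllib2.urlopen(base + '/builds?limit=10').read())
    for build in data['builds']:
        print build['number'], build['results']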
-------------- next part --------------
import datetime
import time
import json
import urllib
import urllib2
import urlparse
#===============================================================================
class LogInfo(object):
def __init__(self, builderName, name, filename, dataConnector):
self.__name = name
self.__filename = filename
self.__builderName = builderName
self.__dataConnector = dataConnector
@property
def name(self):
return self.__name
def open(self, mode = 'r'):
"""returns file-like object of log contents"""
return self.__dataConnector.openLog(self.__builderName, self.__filename)
#===============================================================================
class BuildStepInfo(object):
def __init__(self, builderName, stepStatus, dataConnector):
self.__name, self.__startTime, self.__finishTime, self.__result, logs = stepStatus
self.__dataConnector = dataConnector
self.__logs = tuple(LogInfo(builderName, name, filename, self.__dataConnector) for name, filename in logs)
@property
def name(self):
return self.__name
@property
def startTime(self):
return self.__startTime
@property
def finishTime(self):
return self.__finishTime
@property
def logs(self):
"""returns tuple of LogInfo objects"""
return self.__logs
@property
def result(self):
return self.__result
#===============================================================================
def timestampToDatetime(ts):
return ts and datetime.datetime.fromtimestamp(ts)
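# NOTE: buildset property values stored in the DB are JSON-encoded [value, source]
# pairs, e.g. '["trunk", "Build"]'; decodeBuildbotProperty below returns only the value.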
def decodeBuildbotProperty(prop, default = ''):
if prop:
return json.loads(prop)[0]
else:
return default
MINIMAL_REFRESH_TIME = 60
BUILDBOT_JSON_REQUEST_TIMEOUT = 2
#===============================================================================
class BuildInfo(object):
def __init__(self, row, fields, dataConnector):
self.updateFromDb(row, fields)
self.__steps = None
self.__properties = None
self.__pickleDataLoaded = False
self.__updateFromPickleTimestamp = None
self.__updateFromJsonTimestamp = None
self.__dataConnector = dataConnector
#-------------------------------------------------------------------------------
def updateFromDb(self, row, fields):
self.__builderName = row[fields['buildername']]
self.__buildNumber = row[fields['number']]
self.__startTime = timestampToDatetime(row[fields['start_time']])
self.__finishTime = timestampToDatetime(row[fields['finish_time']])
self.__complete = row[fields['complete']]
self.__result = row[fields['results']]
self.__submittedAt = timestampToDatetime(row[fields['submitted_at']])
self.__revision = row[fields['revision']]
self.__branch = decodeBuildbotProperty(row[fields['product_branch']]).encode('ascii')
self.__completeAt = timestampToDatetime(row[fields['complete_at']])
self.__userName = decodeBuildbotProperty(row[fields['user_name']]).encode('ascii')
self.__claimedBy = row[fields['claimed_by_name']]
self.__buildRequestId = row[fields['brid']]
self.__updateFromDbTimestamp = time.time()
#-------------------------------------------------------------------------------
def loadPickleData(self):
if self.__pickleDataLoaded or (self.result is None) or \
(self.__updateFromPickleTimestamp and \
(time.time() - self.__updateFromPickleTimestamp < MINIMAL_REFRESH_TIME)):
# already loaded, build isn't complete or we're trying too frequently, no need to load
return
self.__updateFromPickleTimestamp = time.time()
properties, steps = self.__dataConnector.loadDataFromPickle(self.__builderName, self.__buildNumber)
if properties is None:
# error reading properties, invalidate them
self.__properties = None
return
self.__properties = properties
if steps:
self.__steps = tuple([BuildStepInfo(self.__builderName, stepStatus, self.__dataConnector) for stepStatus in steps])
else:
# error loading steps, invalidate them
self.__steps = None
return
# all data loaded successfully, mark it
self.__pickleDataLoaded = True
#-------------------------------------------------------------------------------
def loadJsonData(self):
if self.__updateFromJsonTimestamp and \
(time.time() - self.__updateFromJsonTimestamp < MINIMAL_REFRESH_TIME):
# we're requesting it too frequently, do nothing
return
self.__updateFromJsonTimestamp = time.time()
self.refresh()
if (not self.__startTime) or (self.__complete != 0):
# the build is either pending or finished, don't load data from json
return
targetURL = "%s/json/builders/%s/builds/%s" % \
(self.__dataConnector.getBuildbotUrl(), self.__builderName, self.__buildNumber)
targetURL = urllib.quote(targetURL, safe = ':/?=&')
try:
buildDataSock = urllib2.urlopen(targetURL, timeout = BUILDBOT_JSON_REQUEST_TIMEOUT)
buildData = json.loads(buildDataSock.read())
self.__properties = {}
for propertyEntry in buildData['properties']:
propName, propValue = propertyEntry[:2]
self.__properties[propName] = propValue
except:
# error reading properties, invalidate them
self.__properties = None
#-------------------------------------------------------------------------------
def refresh(self):
if (self.__complete != 0) or (time.time() - self.__updateFromDbTimestamp < MINIMAL_REFRESH_TIME):
return
# XXX: refresh currently seems broken... just request a new BuildInfo object each time
# you want up-to-date data. For more, see the comments preceding the getBuildrequestInfo() function.
return
#fields, rows = self.__dataProvider.getBuildrequestInfo(self.__buildRequestId)
#assert len(rows) == 1
#self.updateFromDb(rows[0], fields)
#-------------------------------------------------------------------------------
@property
def steps(self):
"""returns tuple of BuildStepInfo objects"""
if self.__steps is None:
self.loadPickleData()
return self.__steps if self.__steps else ()
@property
def builderName(self):
return self.__builderName
@property
def buildNumber(self):
self.refresh()
if isinstance(self.__buildNumber, int):
return self.__buildNumber
else:
return None
@property
def startTime(self):
return self.__startTime
@property
def finishTime(self):
self.refresh()
return self.__finishTime
@property
def submitTime(self):
return self.__submittedAt
@property
def result(self):
self.refresh()
if self.__complete != 0:
return self.__result
else:
return None
@property
def pending(self):
self.refresh()
return bool(self.__complete == 0 and not self.__claimedBy)
@property
def branch(self):
self.refresh()
return self.__branch
@property
def userName(self):
self.refresh()
return self.__userName
@property
def buildRequestId(self):
return self.__buildRequestId
@property
def initialRevision(self):
self.refresh()
return self.__revision
@property
def buildbotBuildPageUrl(self):
if self.buildNumber is None:
return ""
else:
relativeUrl = urllib.quote('builders/%s/builds/%s' % (self.builderName, self.buildNumber))
return urlparse.urljoin(self.__dataConnector.getBuildbotUrl(), relativeUrl)
@property
def buildbotCancelBuildUrl(self):
if self.buildNumber is not None:
return ""
relativeUrl = urllib.quote('builders/%s/cancelbuild' % self.builderName) + '?id=%s' % self.buildRequestId
return urlparse.urljoin(self.__dataConnector.getBuildbotUrl(), relativeUrl)
@property
def buildbotBuilderQueueUrl(self):
if self.buildNumber is not None:
return ""
relativeUrl = urllib.quote('builders/%s/' % self.builderName)
return urlparse.urljoin(self.__dataConnector.getBuildbotUrl(), relativeUrl)
#-------------------------------------------------------------------------------
def getActualRevision(self, updateFromProperties = False):
# Set updateFromProperties to True only if you expect that revision/got_revision
# properties of the build may change within its life; for current TCAR configuration
# this is so for trigger tasks and Phase II builds only.
if updateFromProperties:
props = self.properties
return props.get('got_revision') or \
props.get('last_revision') or \
props.get('revision')
else:
return self.initialRevision
@property
def properties(self):
if self.pending:
return {}
if self.__properties is None:
self.loadPickleData()
if not self.__pickleDataLoaded:
self.loadJsonData()
return self.__properties if self.__properties else {}
#===============================================================================
def _toDbTime(t):
if t is None:
return t
if isinstance(t, float):
return str(t)
return str(time.mktime(t.timetuple()))
#-------------------------------------------------------------------------------
class BuildbotDataProvider(object):
def __init__(self, dataConnector = None, bbConfig = None):
if dataConnector:
self.__dataConnector = dataConnector
else:
from bb_local_connector import createConnector
self.__dataConnector = createConnector(bbConfig)
#-------------------------------------------------------------------------------
def getBuildsByBuilderName(self, builder = None,
timeStart = None, timeFinish = None,
running = True, pending = True, completed = True,
userName = None, revision = None):
"""return tuple of BuildInfo objects for specified builder name satisfying other
specified criteria
"""
# builder name can be SQL-formatted, e.g. '%some_name%blah' - in SQL '%' means
# 'anything', much like '.*' in regexp
fields, rows = self.__dataConnector.getBuildsByBuilderName(builder, _toDbTime(timeStart), _toDbTime(timeFinish), running, pending, completed, userName, revision)
return [BuildInfo(x, fields, self.__dataConnector) for x in rows]
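#===============================================================================
# Rough usage sketch (the builder name below is a placeholder; '%' acts as an
# SQL wildcard, see getBuildsByBuilderName above):
#
#   provider = BuildbotDataProvider(bbConfig=importedMasterCfg)
#   for build in provider.getBuildsByBuilderName(builder='%nightly%', completed=True):
#       print build.builderName, build.buildNumber, build.result
#       for step in build.steps:
#           print '   ', step.name, step.result
#           for log in step.logs:
#               with log.open() as logFile:
#                   print logFile.read()[:200]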
-------------- next part --------------
#packages needed for the jsonrpc client:
#pip install bunch
#pip install python-jsonrpc
import pyjsonrpc.rpclib
import StringIO
import contextlib
#===============================================================================
class BBRpcConnector(object):
def __init__(self, url, username, password):
self.__service = pyjsonrpc.HttpClient(url, username=username, password=password)
def openLog(self, builderName, fileName):
content, error = self.__service.readWholeLog(builderName, fileName)
if error:
raise IOError(error)
return contextlib.closing(StringIO.StringIO(content))
def loadDataFromPickle(self, builderName, buildNumber):
return self.__service.loadDataFromPickle(builderName, buildNumber)
def getBuildbotUrl(self):
return self.__service.getBuildbotUrl()
def getBuildsByBuilderName(self, builder = None,
timeStart = None, timeFinish = None,
running = True, pending = True, completed = True,
userName = None, revision = None):
return self.__service.getBuildsByBuilderName(builder, timeStart, timeFinish, running, pending, completed, userName, revision)
#===============================================================================
def createConnector(url, username, password):
return BBRpcConnector(url, username, password)
-------------- next part --------------
#pylint: disable=missing-docstring
import json
import cPickle as pickle
import os
import threading
from twisted.persisted.styles import Versioned as TwistedVersioned
from bz2 import BZ2File
from gzip import GzipFile
import contextlib
from buildbot.db import enginestrategy
from fast_cache import FastCache
#===============================================================================
class DB(object):
def __init__(self, dbUrl, basedir):
self.__engine = enginestrategy.create_engine(dbUrl, basedir = basedir)
def executeQuery(self, query, params = ()):
assert isinstance(params, (list, tuple)), 'params should be tuple or list'
if isinstance(params, list):
params = tuple(params)
with self.__engine.begin() as connection:
result = connection.execute(query, params)
fields = [(colinfo[0], position) for position, colinfo in \
enumerate(result._cursor_description())]
return dict(fields), result.fetchall()
#===============================================================================
def setstateStub(self, state):
self.__dict__ = state
#===============================================================================
class TwistedLeakProtector(object): #pylint: disable=too-few-public-methods
# If we un-pickle a BuildInfo object outside of a Twisted environment without monkey-patching
# it a bit, it stays in memory indefinitely, as it caches itself in some global place
# and expects Twisted to clean it up later. So we introduce this class, which monkey-patches
# a Twisted class on scope enter and un-patches it back on exit. It also locks the scope
# so we don't run into multithreading problems.
# NOTE: the lock is bound to the class, not to an instance, so it is effectively a singleton.
lock = threading.RLock()
def __init__(self):
self.oldSetState = TwistedVersioned.__setstate__
def __enter__(self):
# grab the lock so only one thread would be patching back and forth an object
TwistedLeakProtector.lock.acquire()
TwistedVersioned.__setstate__ = setstateStub
def __exit__(self, excType, excVal, excTb):
TwistedVersioned.__setstate__ = self.oldSetState
TwistedLeakProtector.lock.release()
return False
#===============================================================================
class BBConnector(object):
openers = {'' : open,
'.bz2': BZ2File,
'.gz' : GzipFile,
}
def __init__(self, bbConfig, maxCache=100):
self.__db = DB(bbConfig.c['db_url'], bbConfig.basedir)
self.cfg = bbConfig
self.builderDirs = {}
for builder in bbConfig.c['builders']:
self.builderDirs[builder['name']] = os.path.join(bbConfig.basedir, builder['builddir'])
self.cache = FastCache(maxCache)
def openLog(self, builderName, fileName):
if os.path.dirname(fileName) != "":
raise Exception("fileName must not contain directory part for security concerns")
filePath = os.path.join(self.builderDirs[builderName], fileName)
for (ext, opener) in self.openers.iteritems():
if os.path.exists(filePath + ext):
return contextlib.closing(opener(filePath + ext, "r"))
raise IOError("log file doesn't exist")
def __loadPickle(self, builderName, buildNumber):
try:
return self.cache[(builderName, buildNumber)]
except KeyError:
pass
with TwistedLeakProtector():
try:
with open(os.path.join(self.builderDirs[builderName], str(buildNumber)), 'rb') \
as pickleFile:
self.cache[(builderName, buildNumber)] = result = pickle.load(pickleFile)
return result
# we don't care what exception is raised - we report a failure anyway
except: #pylint: disable=bare-except
# error reading pickle, don't cache it
return None
def loadDataFromPickle(self, builderName, buildNumber):
pickleData = self.__loadPickle(builderName, buildNumber)
properties = {}
try:
for propName, propValue, _ in pickleData.properties.asList():
properties[propName] = propValue
# we don't care what exception happens as we mark properties as invalid in any case
except: #pylint: disable=bare-except
properties = None
steps = []
try:
for stepStatus in pickleData.steps:
steps.append((stepStatus.name, stepStatus.started, stepStatus.finished,
stepStatus.results,
[(log.name, log.filename) for log in stepStatus.logs]))
# we don't care what exception happens as we mark steps as invalid in any case
except: #pylint: disable=bare-except
steps = None
return (properties, steps)
def getBuildbotUrl(self):
return self.cfg.c['buildbotURL']
def getBuildsByBuilderName(self, builder=None, timeStart=None, timeFinish=None,
running=True, pending=True, completed=True, userName=None, revision=None):
queryParams = []
querySelect = """br.buildername, b.number, b.start_time, b.finish_time, br.complete,
br.results, br.submitted_at, ss.revision, branches.property_value as product_branch,
br.complete_at, obj.name as claimed_by_name, users.property_value as user_name, br.id as brid
"""
queryFrom = """buildrequests as br
LEFT JOIN buildset_properties AS branches ON branches.buildsetid = br.buildsetid AND branches.property_name = 'product_branch'
LEFT JOIN buildset_properties AS users ON users.buildsetid = br.buildsetid AND users.property_name = 'user_name'
LEFT JOIN builds AS b ON b.brid = br.id
INNER JOIN buildsets AS bs ON bs.id = br.buildsetid
INNER JOIN sourcestamps AS ss ON ss.sourcestampsetid = bs.sourcestampsetid
LEFT JOIN buildrequest_claims AS brc ON brc.brid = br.id
LEFT JOIN objects AS obj ON obj.id = brc.objectid
"""
queryWhere = '1=1 '
if builder:
if '%' in builder:
queryWhere += ' AND br.buildername LIKE ?'
else:
queryWhere += ' AND br.buildername = ?'
queryParams.append(builder)
stateFilter = []
if running:
stateFilter.append("b.finish_time IS NULL AND b.start_time IS NOT NULL AND "
"(br.complete=0 OR br.results IS NULL) AND b.number IS NOT NULL")
if pending:
stateFilter.append("b.number IS NULL AND br.complete = 0")
if completed:
stateFilter.append("br.complete != 0 AND b.number IS NOT NULL")
if not stateFilter:
# no states requested - return an empty result in the usual (fields, rows) shape
return {}, []
if len(stateFilter) < 3:
queryWhere += " AND (%s) " % " OR ".join(["(%s)" % f for f in stateFilter])
if userName:
queryWhere += " AND users.property_value LIKE ? "
queryParams.append('["%s"%%' % userName)
if timeStart is not None:
queryWhere += " AND br.submitted_at > ? "
queryParams.append(timeStart)
if timeFinish is not None:
queryWhere += " AND br.submitted_at < ? "
queryParams.append(timeFinish)
if revision is not None:
queryWhere += " AND ss.revision = ? "
queryParams.append(revision)
queryText = 'SELECT %(querySelect)s FROM %(queryFrom)s WHERE %(queryWhere)s' % locals()
fields, rows = self.__db.executeQuery(queryText, tuple(queryParams))
resultsIndex, completeIndex = fields['results'], fields['complete']
builderNameIndex, buildNumberIndex = fields['buildername'], fields['number']
def isValidBuildRow(row):
if row[builderNameIndex] not in self.builderDirs:
return False
buildObj = self.__loadPickle(row[builderNameIndex], row[buildNumberIndex])
if not buildObj:
# pickle is probably missing => build cannot be complete
if row[completeIndex] != 0:
return False
# without pickle file there's no way to distinguish between pending and running
return True
# if the pickle is there, the build must be complete, and the results in the DB
# should match those in the pickled object
return row[completeIndex] != 0 and row[resultsIndex] == buildObj.results
return fields, [tuple(row) for row in rows if isValidBuildRow(row)]
def getAllUsers(self, timeStart=None):
query = """
SELECT bp.property_value FROM buildrequests AS br LEFT OUTER JOIN buildset_properties AS bp
ON br.buildsetid=bp.buildsetid
WHERE bp.property_name='user_name'
"""
queryParams = []
if timeStart:
query += " AND br.submitted_at > ?"
queryParams.append(timeStart)
_, rawNames = self.__db.executeQuery(query, tuple(queryParams))
names = set()
for item in rawNames:
# "item" is json representation, looks like a string
# like ["achistya", "Build (in triggering build)"]
try:
names.add(json.loads(item[0])[0].encode('ascii'))
# just silently ignore any invalid lines returned by DB query - we don't care
except: #pylint: disable=bare-except
pass
return sorted(names)
#===============================================================================
def createConnector(bbConfig):
return BBConnector(bbConfig)
-------------- next part --------------
"""
Smart cache that limits the number of cached objects to a specified value.
"""
#pylint: disable=missing-docstring
import weakref
class CacheElement(object): #pylint: disable=too-few-public-methods
'''
This is a container used to store values, so that any object can be put into this
mapping (otherwise some objects would complain about "cannot create a weak reference
to an object", e.g. the int type cannot be weak-referenced); it is also used to build the
doubly-linked list that actually holds references to the objects.
'''
__slots__ = ['prev', 'next', 'value', '__weakref__']
def __init__(self, value):
self.prev, self.next, self.value = None, None, value
class FastCache(object): #pylint: disable=too-few-public-methods
'''
This is a dict-like object that keeps no more than maxCount objects; each access to an
element (read or write) marks it as fresh, i.e. moves it to the end of the eviction
queue.
'''
def __init__(self, maxCount):
self.dict = weakref.WeakValueDictionary()
self.head = None
self.tail = None
self.count = 0
self.maxCount = maxCount
def _removeElement(self, element):
prevElement, nextElement = element.prev, element.next
if prevElement:
prevElement.next = nextElement
elif self.head == element:
self.head = nextElement
if nextElement:
nextElement.prev = prevElement
elif self.tail == element:
self.tail = prevElement
element.prev, element.next = None, None
self.count -= 1
def _appendElement(self, element):
if element is None:
return
element.prev, element.next = self.tail, None
if self.head is None:
self.head = element
if self.tail is not None:
self.tail.next = element
self.tail = element
self.count += 1
def get(self, key, default=None):
try:
return self[key]
except KeyError:
return default
def has_key(self, key): #pylint: disable=invalid-name
return self.dict.has_key(key)
def __contains__(self, key):
return key in self.dict
def __len__(self):
return len(self.dict)
def __getitem__(self, key):
element = self.dict[key]
# At this point the element is not None
self._removeElement(element)
self._appendElement(element)
return element.value
def __setitem__(self, key, value):
try:
element = self.dict[key]
self._removeElement(element)
except KeyError:
if self.count == self.maxCount:
self._removeElement(self.head)
element = CacheElement(value)
self._appendElement(element)
self.dict[key] = element
def __del__(self):
while self.head:
self._removeElement(self.head)
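#===============================================================================
# Usage sketch: with maxCount=2, touching 'a' makes 'b' the oldest entry, so
# inserting a third key evicts 'b' (on CPython the evicted, weakly-referenced
# entry disappears from the dict as soon as it drops out of the linked list):
#
#   cache = FastCache(2)
#   cache['a'] = 1
#   cache['b'] = 2
#   _ = cache['a']        # refresh 'a'
#   cache['c'] = 3        # evicts 'b'
#   print 'b' in cache    # False (on CPython), while cache.get('a') is still 1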