From: Michael Vogt
Date: Tue, 10 Jun 2014 12:49:58 +0000 (+0200)
Subject: Merge remote-tracking branch 'mvo/feature/apt-update-info' into debian/sid
X-Git-Tag: 1.0.4~6
X-Git-Url: https://git.saurik.com/apt.git/commitdiff_plain/59d6e5b06c25acdd6583ea801740c36acabc19ac?hp=8f418981337503ff7abedd872f788b51bcdbc886

Merge remote-tracking branch 'mvo/feature/apt-update-info' into debian/sid
---
diff --git a/.gitignore b/.gitignore
index 488682ee7..a00c84a02 100644
--- a/.gitignore
+++ b/.gitignore
@@ -42,3 +42,9 @@
 /vendor/current
 /vendor/*/sources.list
 /vendor/*/makefile.auto
+
+# generated for and by abicheck
+/abicheck/apt_build.xml
+/abicheck/apt_installed.xml
+/abicheck/compat_reports/
+/abicheck/logs/
diff --git a/.travis.yml b/.travis.yml
index 2d9194c28..b413134c5 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -1,5 +1,6 @@
 language: cpp
 before_install:
  - sudo apt-get update -q
- - sudo apt-get install -q --no-install-recommends dpkg-dev debhelper libdb-dev gettext libcurl4-gnutls-dev zlib1g-dev libbz2-dev xsltproc docbook-xsl docbook-xml po4a autotools-dev autoconf automake doxygen debiandoc-sgml stunnel4
+ - sudo ./prepare-release travis-ci
+ - sudo apt-get install -q --no-install-recommends stunnel4
 script: make && make test && test/integration/run-tests
diff --git a/README.MultiArch b/README.MultiArch
deleted file mode 100644
index 588419b8d..000000000
--- a/README.MultiArch
+++ /dev/null
@@ -1,63 +0,0 @@
-Before we start with this topic: Note that MultiArch is not yet ready for
-prime time and/or for the casual user. The implementation is so far widely
-untested and only useful for developers of packagemanagment tools which
-use APT and his friends and maintainers of (upcoming) MultiArch packages.
-This README is especially NOT written for the casual user and is NOT a
-usage guide - you have been warned. It is assumed that the reader has
-at least a bit of knowledge about APT internals, dependency relations
-and the MultiArch spec [0].
-
-Note also that the toolchain isn't ready yet, e.g. while you can simulate
-the installation of MultiArch packages they will more sooner than later
-cause enormous problems if really installed as dpkg can't handle MultiArch
-yet (no, --force-{overwrite,architecture} aren't good options here).
-Other parts of the big picture are missing and/or untested too.
-You have been warned!
-
-
-The implementation is focused on NOT breaking existing singleArch-only
-applications and/or systems as this is the current status-quo for all
-systems. Also, many systems don't need (or can't make use of) MultiArch,
-so APT will proceed in thinking SingleArch as long as it is not explicitly
-told to handle MultiArch:
-To activate MultiArch handling you need to specify architectures you
-want to be considered by APT with the config list APT::Architectures
-(Insert architectures in order of preference).
-APT will download Packages files for all these architectures in the
-update step. Exception: In the sourcelist is the optionfield used:
-deb [ arch=amd64,i386 ] http://example.org/ experimental main
-(This optionfield is a NOP in previous apt versions)
-
-Internally in APT a package is represented as a PkgIterator -
-before MultiArch this PkgIterator was architecture unaware,
-only VerIterators include the architecture they came from.
-This is/was a big problem as all versions in a package are
-considered for dependency resolution, so pinning will not work in all cases.
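The per-architecture grouping this README goes on to describe can be illustrated with a minimal sketch; it assumes the libapt-pkg GrpIterator interface named in the next paragraph (FindGrp, PackageList, NextPkg) and a hypothetical helper name ListGroup, so details may differ between apt versions:

#include <apt-pkg/pkgcache.h>
#include <iostream>

// Hypothetical helper: print every architecture-specific package
// belonging to the "foobar" group (the README's example name).
void ListGroup(pkgCache &Cache)
{
   pkgCache::GrpIterator Grp = Cache.FindGrp("foobar");
   if (Grp.end() == true)
      return;
   for (pkgCache::PkgIterator Pkg = Grp.PackageList(); Pkg.end() == false; Pkg = Grp.NextPkg(Pkg))
      std::cout << Pkg.FullName(true) << std::endl; // e.g. "foobar:amd64", "foobar:i386"
}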
-
-The problem is solved by a conceptional change:
-A PkgIterator is now architecture aware, so the packages
-of foobar for amd64 and for i386 are now for apt internal totally
-different packages. That is a good thing for e.g. pinning, but
-sometimes you need the information that such packages are belonging together:
-All these foobar packages therefore form a Group accessible with GrpIterators.
-Note that the GrpIterator has the same name as all the packages in this group,
-so e.g. apt-cache pkgnames iterates over GrpIterator to get the package names:
-This is compatible to SingleArch as a Group consists only of a single package
-and also to MultiArch as a Group consists of possible many packages which
-all have the same name and are therefore out of interest for pkgnames.
-
-
-Given all these internal changes it is quite interesting that the actual
-implementation of MultiArch is trivial: Some implicit dependencies and a few
-more provides are all changes needed to get it working. Especially noteworthy
-is that it wasn't needed to change the resolver in any way and other parts only
-need to be told about using GrpIterator instead of PkgIterator, so chances are
-good that libapt-applications will proceed to work without or at least only
-require minor changes, but your mileage may vary…
-
-
-Known Issues and/or noteworthy stuff:
-* The implementation is mostly untested, so it is very likely that APT will
-  eat your kids if you aren't as lucky as the author of these patches.
-
-[0] https://wiki.ubuntu.com/MultiarchSpec
diff --git a/abicheck/run_abi_test b/abicheck/run_abi_test
index 8f2d7d203..18c13dfcb 100755
--- a/abicheck/run_abi_test
+++ b/abicheck/run_abi_test
@@ -1,5 +1,8 @@
 #!/bin/sh
+# ensure we are in the abibreak subdirectory
+cd "$(readlink -f $(dirname $0))"
+
 if [ ! -d ../build ]; then
 	echo "../build missing, did you run make?"
 	exit 1
@@ -10,7 +13,7 @@ if [ !
-x "$(which abi-compliance-checker 2>/dev/null )" ]; then exit 1 fi -LIBPATH=$(find /usr/lib/ -type f -name "libapt-*.so.*" -printf %p\\\\n) +LIBPATH=$(find /usr/lib/$(dpkg-architecture -qDEB_HOST_MULTIARCH) -type f -regex '.*/libapt-\(pkg\|inst\)\.so\..*' -printf %p\\\\n) sed s#@installed_libapt@#$LIBPATH# apt_installed.xml.in > apt_installed.xml BUILDPATH=$(readlink -f ../build) diff --git a/apt-inst/makefile b/apt-inst/makefile index da983df5f..af887bba8 100644 --- a/apt-inst/makefile +++ b/apt-inst/makefile @@ -20,15 +20,7 @@ SLIBS=$(PTHREADLIB) -lapt-pkg APT_DOMAIN:=libapt-inst$(MAJOR) LIBRARYDEPENDS=$(LIB)/libapt-pkg.so -# Source code for the contributed non-core things -SOURCE = contrib/extracttar.cc contrib/arfile.cc +SOURCE = $(wildcard *.cc */*.cc) +HEADERS = $(addprefix apt-pkg/,$(notdir $(wildcard *.h */*.h))) -# Source code for the main library -SOURCE+= filelist.cc dirstream.cc extract.cc deb/debfile.cc - -# Public header files -HEADERS = extracttar.h arfile.h filelist.h extract.h \ - dirstream.h debfile.h - -HEADERS := $(addprefix apt-pkg/,$(HEADERS)) include $(LIBRARY_H) diff --git a/apt-pkg/acquire-item.cc b/apt-pkg/acquire-item.cc index 30743addf..0178456a8 100644 --- a/apt-pkg/acquire-item.cc +++ b/apt-pkg/acquire-item.cc @@ -932,6 +932,8 @@ pkgAcqIndex::pkgAcqIndex(pkgAcquire *Owner, } CompressionExtension = comprExt; + Verify = true; + Init(URI, URIDesc, ShortDesc); } pkgAcqIndex::pkgAcqIndex(pkgAcquire *Owner, IndexTarget const *Target, diff --git a/apt-pkg/algorithms.cc b/apt-pkg/algorithms.cc index a7b676660..608ec7fce 100644 --- a/apt-pkg/algorithms.cc +++ b/apt-pkg/algorithms.cc @@ -445,19 +445,22 @@ void pkgProblemResolver::MakeScores() || (I->Flags & pkgCache::Flag::Important) == pkgCache::Flag::Important) Score += PrioEssentials; - // We transform the priority - if (Cache[I].InstVerIter(Cache)->Priority <= 5) - Score += PrioMap[Cache[I].InstVerIter(Cache)->Priority]; - + pkgCache::VerIterator const InstVer = Cache[I].InstVerIter(Cache); + // We apply priorities only to downloadable packages, all others are prio:extra + // as an obsolete prio:standard package can't be that standard anymore… + if (InstVer->Priority <= pkgCache::State::Extra && InstVer.Downloadable() == true) + Score += PrioMap[InstVer->Priority]; + else + Score += PrioMap[pkgCache::State::Extra]; + /* This helps to fix oddball problems with conflicting packages - on the same level. We enhance the score of installed packages - if those are not obsolete - */ + on the same level. We enhance the score of installed packages + if those are not obsolete */ if (I->CurrentVer != 0 && Cache[I].CandidateVer != 0 && Cache[I].CandidateVerIter(Cache).Downloadable()) Score += PrioInstalledAndNotObsolete; // propagate score points along dependencies - for (pkgCache::DepIterator D = Cache[I].InstVerIter(Cache).DependsList(); D.end() == false; ++D) + for (pkgCache::DepIterator D = InstVer.DependsList(); D.end() == false; ++D) { if (DepMap[D->Type] == 0) continue; diff --git a/apt-pkg/aptconfiguration.cc b/apt-pkg/aptconfiguration.cc index 6ba047560..9982759c6 100644 --- a/apt-pkg/aptconfiguration.cc +++ b/apt-pkg/aptconfiguration.cc @@ -476,7 +476,7 @@ const Configuration::getCompressors(bool const Cached) { std::vector const comp = _config->FindVector("APT::Compressor"); for (std::vector::const_iterator c = comp.begin(); c != comp.end(); ++c) { - if (*c == "." || *c == "gzip" || *c == "bzip2" || *c == "lzma" || *c == "xz") + if (c->empty() || *c == "." 
|| *c == "gzip" || *c == "bzip2" || *c == "lzma" || *c == "xz") continue; compressors.push_back(Compressor(c->c_str(), std::string(".").append(*c).c_str(), c->c_str(), "-9", "-d", 100)); } diff --git a/apt-pkg/cacheset.cc b/apt-pkg/cacheset.cc index d453a2bfb..2ed6a96da 100644 --- a/apt-pkg/cacheset.cc +++ b/apt-pkg/cacheset.cc @@ -391,6 +391,8 @@ bool VersionContainerInterface::FromModifierCommandLine(unsigned short &modID, CacheSetHelper &helper) { Version select = NEWEST; std::string str = cmdline; + if (unlikely(str.empty() == true)) + return false; bool modifierPresent = false; unsigned short fallback = modID; for (std::list::const_iterator mod = mods.begin(); @@ -400,8 +402,8 @@ bool VersionContainerInterface::FromModifierCommandLine(unsigned short &modID, size_t const alength = strlen(mod->Alias); switch(mod->Pos) { case Modifier::POSTFIX: - if (str.compare(str.length() - alength, alength, - mod->Alias, 0, alength) != 0) + if (str.length() <= alength || + str.compare(str.length() - alength, alength, mod->Alias, 0, alength) != 0) continue; str.erase(str.length() - alength); modID = mod->ID; diff --git a/apt-pkg/cdrom.cc b/apt-pkg/cdrom.cc index 2635ede76..a5ad6a9ff 100644 --- a/apt-pkg/cdrom.cc +++ b/apt-pkg/cdrom.cc @@ -563,6 +563,15 @@ bool pkgCdrom::WriteSourceList(string Name,vector &List,bool Source) return true; } /*}}}*/ +bool pkgCdrom::UnmountCDROM(std::string const &CDROM, pkgCdromStatus * const log)/*{{{*/ +{ + if (_config->FindB("APT::CDROM::NoMount",false) == true) + return true; + if (log != NULL) + log->Update(_("Unmounting CD-ROM...\n"), STEP_LAST); + return UnmountCdrom(CDROM); +} + /*}}}*/ bool pkgCdrom::MountAndIdentCDROM(Configuration &Database, std::string &CDROM, std::string &ident, pkgCdromStatus * const log, bool const interactive)/*{{{*/ { // Startup @@ -583,9 +592,7 @@ bool pkgCdrom::MountAndIdentCDROM(Configuration &Database, std::string &CDROM, s { if (interactive == true) { - if(log != NULL) - log->Update(_("Unmounting CD-ROM...\n"), STEP_LAST); - UnmountCdrom(CDROM); + UnmountCDROM(CDROM, log); if(log != NULL) { @@ -605,6 +612,9 @@ bool pkgCdrom::MountAndIdentCDROM(Configuration &Database, std::string &CDROM, s return _error->Error("Failed to mount the cdrom."); } + if (IsMounted(CDROM) == false) + return _error->Error("Failed to mount the cdrom."); + // Hash the CD to get an ID if (log != NULL) log->Update(_("Identifying... 
"), STEP_IDENT); @@ -614,6 +624,7 @@ bool pkgCdrom::MountAndIdentCDROM(Configuration &Database, std::string &CDROM, s ident = ""; if (log != NULL) log->Update("\n"); + UnmountCDROM(CDROM, NULL); return false; } @@ -629,8 +640,11 @@ bool pkgCdrom::MountAndIdentCDROM(Configuration &Database, std::string &CDROM, s if (FileExists(DFile) == true) { if (ReadConfigFile(Database,DFile) == false) + { + UnmountCDROM(CDROM, NULL); return _error->Error("Unable to read the cdrom database %s", DFile.c_str()); + } } return true; } @@ -651,13 +665,7 @@ bool pkgCdrom::Ident(string &ident, pkgCdromStatus *log) /*{{{*/ } // Unmount and finish - if (_config->FindB("APT::CDROM::NoMount",false) == false) - { - if (log != NULL) - log->Update(_("Unmounting CD-ROM...\n"), STEP_LAST); - UnmountCdrom(CDROM); - } - + UnmountCDROM(CDROM, log); return true; } /*}}}*/ @@ -682,11 +690,15 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ { if (log != NULL) log->Update("\n"); + UnmountCDROM(CDROM, NULL); return false; } if (chdir(StartDir.c_str()) != 0) + { + UnmountCDROM(CDROM, NULL); return _error->Errno("chdir","Unable to change to %s", StartDir.c_str()); + } if (_config->FindB("Debug::aptcdrom",false) == true) { @@ -728,8 +740,7 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ if (List.empty() == true && SourceList.empty() == true) { - if (_config->FindB("APT::CDROM::NoMount",false) == false) - UnmountCdrom(CDROM); + UnmountCDROM(CDROM, NULL); return _error->Error(_("Unable to locate any package files, perhaps this is not a Debian Disc or the wrong architecture?")); } @@ -769,14 +780,14 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ { if(log == NULL) { - if (_config->FindB("APT::CDROM::NoMount",false) == false) - UnmountCdrom(CDROM); + UnmountCDROM(CDROM, NULL); return _error->Error("No disc name found and no way to ask for it"); } while(true) { if(!log->AskCdromName(Name)) { // user canceld + UnmountCDROM(CDROM, NULL); return false; } cout << "Name: '" << Name << "'" << endl; @@ -813,7 +824,10 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ string const partialListDir = listDir + "partial/"; if (CreateAPTDirectoryIfNeeded(_config->FindDir("Dir::State"), partialListDir) == false && CreateAPTDirectoryIfNeeded(listDir, partialListDir) == false) + { + UnmountCDROM(CDROM, NULL); return _error->Errno("cdrom", _("List directory %spartial is missing."), listDir.c_str()); + } // take care of the signatures and copy them if they are ok // (we do this before PackageCopy as it modifies "List" and "SourceList") @@ -827,7 +841,10 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ if (Copy.CopyPackages(CDROM,Name,List, log) == false || SrcCopy.CopyPackages(CDROM,Name,SourceList, log) == false || TransCopy.CopyTranslations(CDROM,Name,TransList, log) == false) + { + UnmountCDROM(CDROM, NULL); return false; + } // reduce the List so that it takes less space in sources.list ReduceSourcelist(CDROM,List); @@ -837,13 +854,19 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ if (_config->FindB("APT::cdrom::NoAct",false) == false) { if (WriteDatabase(Database) == false) + { + UnmountCDROM(CDROM, NULL); return false; - + } + if(log != NULL) log->Update(_("Writing new source list\n"), STEP_WRITE); if (WriteSourceList(Name,List,false) == false || WriteSourceList(Name,SourceList,true) == false) + { + UnmountCDROM(CDROM, NULL); return false; + } } // Print the sourcelist entries @@ -855,8 +878,7 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ string::size_type Space = (*I).find(' '); if (Space == string::npos) { - if 
(_config->FindB("APT::CDROM::NoMount",false) == false) - UnmountCdrom(CDROM); + UnmountCDROM(CDROM, NULL); return _error->Error("Internal error"); } @@ -874,8 +896,7 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ string::size_type Space = (*I).find(' '); if (Space == string::npos) { - if (_config->FindB("APT::CDROM::NoMount",false) == false) - UnmountCdrom(CDROM); + UnmountCDROM(CDROM, NULL); return _error->Error("Internal error"); } @@ -888,12 +909,7 @@ bool pkgCdrom::Add(pkgCdromStatus *log) /*{{{*/ } // Unmount and finish - if (_config->FindB("APT::CDROM::NoMount",false) == false) { - if (log != NULL) - log->Update(_("Unmounting CD-ROM...\n"), STEP_LAST); - UnmountCdrom(CDROM); - } - + UnmountCDROM(CDROM, log); return true; } /*}}}*/ diff --git a/apt-pkg/cdrom.h b/apt-pkg/cdrom.h index 0f2c2cd02..bd0902176 100644 --- a/apt-pkg/cdrom.h +++ b/apt-pkg/cdrom.h @@ -77,6 +77,7 @@ class pkgCdrom /*{{{*/ private: APT_HIDDEN bool MountAndIdentCDROM(Configuration &Database, std::string &CDROM, std::string &ident, pkgCdromStatus * const log, bool const interactive); + APT_HIDDEN bool UnmountCDROM(std::string const &CDROM, pkgCdromStatus * const log); }; /*}}}*/ diff --git a/apt-pkg/contrib/fileutl.cc b/apt-pkg/contrib/fileutl.cc index 188bb87ee..1ba4674e5 100644 --- a/apt-pkg/contrib/fileutl.cc +++ b/apt-pkg/contrib/fileutl.cc @@ -58,13 +58,10 @@ #include #endif #ifdef HAVE_LZMA - #include #include #endif - -#ifdef WORDS_BIGENDIAN -#include -#endif +#include +#include #include /*}}}*/ @@ -958,10 +955,10 @@ class FileFdPrivate { /*{{{*/ // FileFd::Open - Open a file /*{{{*/ // --------------------------------------------------------------------- /* The most commonly used open mode combinations are given with Mode */ -bool FileFd::Open(string FileName,unsigned int const Mode,CompressMode Compress, unsigned long const Perms) +bool FileFd::Open(string FileName,unsigned int const Mode,CompressMode Compress, unsigned long const AccessMode) { if (Mode == ReadOnlyGzip) - return Open(FileName, ReadOnly, Gzip, Perms); + return Open(FileName, ReadOnly, Gzip, AccessMode); if (Compress == Auto && (Mode & WriteOnly) == WriteOnly) return FileFdError("Autodetection on %s only works in ReadOnly openmode!", FileName.c_str()); @@ -1028,9 +1025,9 @@ bool FileFd::Open(string FileName,unsigned int const Mode,CompressMode Compress, if (compressor == compressors.end()) return FileFdError("Can't find a match for specified compressor mode for file %s", FileName.c_str()); - return Open(FileName, Mode, *compressor, Perms); + return Open(FileName, Mode, *compressor, AccessMode); } -bool FileFd::Open(string FileName,unsigned int const Mode,APT::Configuration::Compressor const &compressor, unsigned long const Perms) +bool FileFd::Open(string FileName,unsigned int const Mode,APT::Configuration::Compressor const &compressor, unsigned long const AccessMode) { Close(); Flags = AutoClose; @@ -1080,11 +1077,18 @@ bool FileFd::Open(string FileName,unsigned int const Mode,APT::Configuration::Co TemporaryFileName = string(name); free(name); - if(Perms != 600 && fchmod(iFd, Perms) == -1) + // umask() will always set the umask and return the previous value, so + // we first set the umask and then reset it to the old value + mode_t const CurrentUmask = umask(0); + umask(CurrentUmask); + // calculate the actual file permissions (just like open/creat) + mode_t const FilePermissions = (AccessMode & ~CurrentUmask); + + if(fchmod(iFd, FilePermissions) == -1) return FileFdErrno("fchmod", "Could not change permissions for temporary file %s", 
TemporaryFileName.c_str()); } else - iFd = open(FileName.c_str(), fileflags, Perms); + iFd = open(FileName.c_str(), fileflags, AccessMode); this->FileName = FileName; if (iFd == -1 || OpenInternDescriptor(Mode, compressor) == false) @@ -1237,7 +1241,8 @@ bool FileFd::OpenInternDescriptor(unsigned int const Mode, APT::Configuration::C if (d->lzma == NULL) d->lzma = new FileFdPrivate::LZMAFILE; d->lzma->file = (FILE*) compress_struct; - d->lzma->stream = LZMA_STREAM_INIT; + lzma_stream tmp_stream = LZMA_STREAM_INIT; + d->lzma->stream = tmp_stream; if ((Mode & ReadWrite) == ReadWrite) return FileFdError("ReadWrite mode is not supported for file %s", FileName.c_str()); @@ -1353,7 +1358,10 @@ bool FileFd::OpenInternDescriptor(unsigned int const Mode, APT::Configuration::C Args.push_back(a->c_str()); if (Comp == false && FileName.empty() == false) { - Args.push_back("--stdout"); + // commands not needing arguments, do not need to be told about using standard output + // in reality, only testcases with tools like cat, rev, rot13, … are able to trigger this + if (compressor.CompressArgs.empty() == false && compressor.UncompressArgs.empty() == false) + Args.push_back("--stdout"); if (TemporaryFileName.empty() == false) Args.push_back(TemporaryFileName.c_str()); else @@ -1646,6 +1654,8 @@ bool FileFd::Write(int Fd, const void *From, unsigned long long Size) /* */ bool FileFd::Seek(unsigned long long To) { + Flags &= ~HitEof; + if (d != NULL && (d->pipe == true || d->InternalStream() == true)) { // Our poor man seeking in pipes is costly, so try to avoid it @@ -1705,7 +1715,6 @@ bool FileFd::Skip(unsigned long long Over) { if (d != NULL && (d->pipe == true || d->InternalStream() == true)) { - d->seekpos += Over; char buffer[1024]; while (Over != 0) { @@ -1789,7 +1798,8 @@ static bool StatFileFd(char const * const msg, int const iFd, std::string const // higher-level code will generate more meaningful messages, // even translated this would be meaningless for users return _error->Errno("fstat", "Unable to determine %s for fd %i", msg, iFd); - ispipe = S_ISFIFO(Buf.st_mode); + if (FileName.empty() == false) + ispipe = S_ISFIFO(Buf.st_mode); } // for compressor pipes st_size is undefined and at 'best' zero @@ -1869,19 +1879,13 @@ unsigned long long FileFd::Size() FileFdErrno("lseek","Unable to seek to end of gzipped file"); return 0; } - size = 0; + uint32_t size = 0; if (read(iFd, &size, 4) != 4) { FileFdErrno("read","Unable to read original size of gzipped file"); return 0; } - -#ifdef WORDS_BIGENDIAN - uint32_t tmp_size = size; - uint8_t const * const p = (uint8_t const * const) &tmp_size; - tmp_size = (p[3] << 24) | (p[2] << 16) | (p[1] << 8) | p[0]; - size = tmp_size; -#endif + size = le32toh(size); if (lseek(iFd, oldPos, SEEK_SET) < 0) { diff --git a/apt-pkg/contrib/fileutl.h b/apt-pkg/contrib/fileutl.h index f25ed3622..cc1a98eae 100644 --- a/apt-pkg/contrib/fileutl.h +++ b/apt-pkg/contrib/fileutl.h @@ -103,10 +103,10 @@ class FileFd return T; } - bool Open(std::string FileName,unsigned int const Mode,CompressMode Compress,unsigned long const Perms = 0666); - bool Open(std::string FileName,unsigned int const Mode,APT::Configuration::Compressor const &compressor,unsigned long const Perms = 0666); - inline bool Open(std::string const &FileName,unsigned int const Mode, unsigned long const Perms = 0666) { - return Open(FileName, Mode, None, Perms); + bool Open(std::string FileName,unsigned int const Mode,CompressMode Compress,unsigned long const AccessMode = 0666); + bool Open(std::string 
FileName,unsigned int const Mode,APT::Configuration::Compressor const &compressor,unsigned long const AccessMode = 0666); + inline bool Open(std::string const &FileName,unsigned int const Mode, unsigned long const AccessMode = 0666) { + return Open(FileName, Mode, None, AccessMode); }; bool OpenDescriptor(int Fd, unsigned int const Mode, CompressMode Compress, bool AutoClose=false); bool OpenDescriptor(int Fd, unsigned int const Mode, APT::Configuration::Compressor const &compressor, bool AutoClose=false); @@ -129,13 +129,13 @@ class FileFd inline bool IsCompressed() {return (Flags & Compressed) == Compressed;}; inline std::string &Name() {return FileName;}; - FileFd(std::string FileName,unsigned int const Mode,unsigned long Perms = 0666) : iFd(-1), Flags(0), d(NULL) + FileFd(std::string FileName,unsigned int const Mode,unsigned long AccessMode = 0666) : iFd(-1), Flags(0), d(NULL) { - Open(FileName,Mode, None, Perms); + Open(FileName,Mode, None, AccessMode); }; - FileFd(std::string FileName,unsigned int const Mode, CompressMode Compress, unsigned long Perms = 0666) : iFd(-1), Flags(0), d(NULL) + FileFd(std::string FileName,unsigned int const Mode, CompressMode Compress, unsigned long AccessMode = 0666) : iFd(-1), Flags(0), d(NULL) { - Open(FileName,Mode, Compress, Perms); + Open(FileName,Mode, Compress, AccessMode); }; FileFd() : iFd(-1), Flags(AutoClose), d(NULL) {}; FileFd(int const Fd, unsigned int const Mode = ReadWrite, CompressMode Compress = None) : iFd(-1), Flags(0), d(NULL) diff --git a/apt-pkg/contrib/hashes.cc b/apt-pkg/contrib/hashes.cc index 1fce0d75f..15f83615d 100644 --- a/apt-pkg/contrib/hashes.cc +++ b/apt-pkg/contrib/hashes.cc @@ -133,7 +133,7 @@ bool Hashes::AddFD(int const Fd,unsigned long long Size, bool const addMD5, bool const addSHA1, bool const addSHA256, bool const addSHA512) { unsigned char Buf[64*64]; - bool const ToEOF = (Size == 0); + bool const ToEOF = (Size == UntilEOF); while (Size != 0 || ToEOF) { unsigned long long n = sizeof(Buf); diff --git a/apt-pkg/contrib/hashes.h b/apt-pkg/contrib/hashes.h index 5cd1af03b..7a62f8a8f 100644 --- a/apt-pkg/contrib/hashes.h +++ b/apt-pkg/contrib/hashes.h @@ -78,6 +78,8 @@ class Hashes SHA256Summation SHA256; SHA512Summation SHA512; + static const int UntilEOF = 0; + inline bool Add(const unsigned char *Data,unsigned long long Size) { return MD5.Add(Data,Size) && SHA1.Add(Data,Size) && SHA256.Add(Data,Size) && SHA512.Add(Data,Size); diff --git a/apt-pkg/deb/debindexfile.cc b/apt-pkg/deb/debindexfile.cc index eee758b7a..a0dd15cd8 100644 --- a/apt-pkg/deb/debindexfile.cc +++ b/apt-pkg/deb/debindexfile.cc @@ -525,7 +525,7 @@ bool debTranslationsIndex::Merge(pkgCacheGenerator &Gen,OpProgress *Prog) const if (FileExists(TranslationFile)) { FileFd Trans(TranslationFile,FileFd::ReadOnly, FileFd::Extension); - debListParser TransParser(&Trans); + debTranslationsParser TransParser(&Trans); if (_error->PendingError() == true) return false; diff --git a/apt-pkg/deb/deblistparser.h b/apt-pkg/deb/deblistparser.h index baace79fe..3b6963211 100644 --- a/apt-pkg/deb/deblistparser.h +++ b/apt-pkg/deb/deblistparser.h @@ -106,4 +106,15 @@ class debListParser : public pkgCacheGenerator::ListParser APT_HIDDEN unsigned char ParseMultiArch(bool const showErrors); }; +class debTranslationsParser : public debListParser +{ + public: + // a translation can never be a real package + virtual std::string Architecture() { return ""; } + virtual std::string Version() { return ""; } + + debTranslationsParser(FileFd *File, std::string const &Arch = "") + 
: debListParser(File, Arch) {}; +}; + #endif diff --git a/apt-pkg/deb/debsrcrecords.cc b/apt-pkg/deb/debsrcrecords.cc index b09588dd3..a444cbe4d 100644 --- a/apt-pkg/deb/debsrcrecords.cc +++ b/apt-pkg/deb/debsrcrecords.cc @@ -186,6 +186,7 @@ bool debSrcRecordParser::Files(std::vector &List) /* */ debSrcRecordParser::~debSrcRecordParser() { - delete[] Buffer; + // was allocated via strndup() + free(Buffer); } /*}}}*/ diff --git a/apt-pkg/deb/dpkgpm.cc b/apt-pkg/deb/dpkgpm.cc index c3e7e1d1d..613a4de9f 100644 --- a/apt-pkg/deb/dpkgpm.cc +++ b/apt-pkg/deb/dpkgpm.cc @@ -1053,14 +1053,15 @@ void pkgDPkgPM::StartPtyMagic() } // setup the pty and stuff - struct winsize win; + struct winsize win; - // if tcgetattr does not return zero there was a error - // and we do not do any pty magic + // if tcgetattr for both stdin/stdout returns 0 (no error) + // we do the pty magic _error->PushToStack(); - if (tcgetattr(STDOUT_FILENO, &d->tt) == 0) + if (tcgetattr(STDIN_FILENO, &d->tt) == 0 && + tcgetattr(STDOUT_FILENO, &d->tt) == 0) { - if (ioctl(1, TIOCGWINSZ, (char *)&win) < 0) + if (ioctl(STDOUT_FILENO, TIOCGWINSZ, (char *)&win) < 0) { _error->Errno("ioctl", _("ioctl(TIOCGWINSZ) failed")); } else if (openpty(&d->master, &d->slave, NULL, &d->tt, &win) < 0) @@ -1437,7 +1438,8 @@ bool pkgDPkgPM::GoNoABIBreak(APT::Progress::PackageManager *progress) if (_config->FindB("DPkg::FlushSTDIN",true) == true && isatty(STDIN_FILENO)) { - int Flags,dummy; + int Flags; + int dummy = 0; if ((Flags = fcntl(STDIN_FILENO,F_GETFL,dummy)) < 0) _exit(100); @@ -1617,7 +1619,7 @@ void pkgDPkgPM::WriteApportReport(const char *pkgpath, const char *errormsg) string::size_type pos; FILE *report; - if (_config->FindB("Dpkg::ApportFailureReport", false) == false) + if (_config->FindB("Dpkg::ApportFailureReport", true) == false) { std::clog << "configured to not write apport reports" << std::endl; return; diff --git a/apt-pkg/depcache.cc b/apt-pkg/depcache.cc index 19a6e0d7e..c25672d1c 100644 --- a/apt-pkg/depcache.cc +++ b/apt-pkg/depcache.cc @@ -1374,7 +1374,7 @@ bool pkgDepCache::IsInstallOkDependenciesSatisfiableByCandidates(PkgIterator con // the dependency is critical, but can't be installed, so discard the candidate // as the problemresolver will trip over it otherwise trying to install it (#735967) - if (Pkg->CurrentVer != 0) + if (Pkg->CurrentVer != 0 && (PkgState[Pkg->ID].iFlags & Protected) != Protected) SetCandidateVersion(Pkg.CurrentVer()); return false; } @@ -1678,7 +1678,7 @@ pkgCache::VerIterator pkgDepCache::Policy::GetCandidateVer(PkgIterator const &Pk { /* Not source/not automatic versions cannot be a candidate version unless they are already installed */ - VerIterator Last(*(pkgCache *)this,0); + VerIterator Last; for (VerIterator I = Pkg.VersionList(); I.end() == false; ++I) { diff --git a/apt-pkg/edsp.cc b/apt-pkg/edsp.cc index ee42267bc..6d1b68c23 100644 --- a/apt-pkg/edsp.cc +++ b/apt-pkg/edsp.cc @@ -18,6 +18,7 @@ #include #include #include +#include #include #include @@ -87,7 +88,12 @@ bool EDSP::WriteLimitedScenario(pkgDepCache &Cache, FILE* output, void EDSP::WriteScenarioVersion(pkgDepCache &Cache, FILE* output, pkgCache::PkgIterator const &Pkg, pkgCache::VerIterator const &Ver) { + pkgRecords Recs(Cache); + pkgRecords::Parser &rec = Recs.Lookup(Ver.FileList()); + string srcpkg = rec.SourcePkg().empty() ? 
Pkg.Name() : rec.SourcePkg(); + fprintf(output, "Package: %s\n", Pkg.Name()); + fprintf(output, "Source: %s\n", srcpkg.c_str()); fprintf(output, "Architecture: %s\n", Ver.Arch()); fprintf(output, "Version: %s\n", Ver.VerStr()); if (Pkg.CurrentVer() == Ver) @@ -107,10 +113,22 @@ void EDSP::WriteScenarioVersion(pkgDepCache &Cache, FILE* output, pkgCache::PkgI else if ((Ver->MultiArch & pkgCache::Version::Same) == pkgCache::Version::Same) fprintf(output, "Multi-Arch: same\n"); signed short Pin = std::numeric_limits::min(); - for (pkgCache::VerFileIterator File = Ver.FileList(); File.end() == false; ++File) { - signed short const p = Cache.GetPolicy().GetPriority(File.File()); + std::set Releases; + for (pkgCache::VerFileIterator I = Ver.FileList(); I.end() == false; ++I) { + pkgCache::PkgFileIterator File = I.File(); + signed short const p = Cache.GetPolicy().GetPriority(File); if (Pin < p) Pin = p; + if ((File->Flags & pkgCache::Flag::NotSource) != pkgCache::Flag::NotSource) { + string Release = File.RelStr(); + if (!Release.empty()) + Releases.insert(Release); + } + } + if (!Releases.empty()) { + fprintf(output, "APT-Release:\n"); + for (std::set::iterator R = Releases.begin(); R != Releases.end(); ++R) + fprintf(output, " %s\n", R->c_str()); } fprintf(output, "APT-Pin: %d\n", Pin); if (Cache.GetCandidateVer(Pkg) == Ver) @@ -231,7 +249,16 @@ bool EDSP::WriteRequest(pkgDepCache &Cache, FILE* output, bool const Upgrade, continue; req->append(" ").append(Pkg.FullName()); } - fprintf(output, "Request: EDSP 0.4\n"); + fprintf(output, "Request: EDSP 0.5\n"); + + const char *arch = _config->Find("APT::Architecture").c_str(); + std::vector archs = APT::Configuration::getArchitectures(); + fprintf(output, "Architecture: %s\n", arch); + fprintf(output, "Architectures:"); + for (std::vector::const_iterator a = archs.begin(); a != archs.end(); ++a) + fprintf(output, " %s", a->c_str()); + fprintf(output, "\n"); + if (del.empty() == false) fprintf(output, "Remove: %s\n", del.c_str()+1); if (inst.empty() == false) @@ -411,6 +438,13 @@ bool EDSP::ReadRequest(int const input, std::list &install, distUpgrade = EDSP::StringToBool(line.c_str() + 14, false); else if (line.compare(0, 11, "Autoremove:") == 0) autoRemove = EDSP::StringToBool(line.c_str() + 12, false); + else if (line.compare(0, 13, "Architecture:") == 0) + _config->Set("APT::Architecture", line.c_str() + 14); + else if (line.compare(0, 14, "Architectures:") == 0) + { + std::string const archs = line.c_str() + 15; + _config->Set("APT::Architectures", SubstVar(archs, " ", ",")); + } else _error->Warning("Unknown line in EDSP Request stanza: %s", line.c_str()); @@ -508,7 +542,7 @@ bool EDSP::WriteError(char const * const uuid, std::string const &message, FILE* } /*}}}*/ // EDSP::ExecuteSolver - fork requested solver and setup ipc pipes {{{*/ -bool EDSP::ExecuteSolver(const char* const solver, int *solver_in, int *solver_out) { +pid_t EDSP::ExecuteSolver(const char* const solver, int * const solver_in, int * const solver_out, bool) { std::vector const solverDirs = _config->FindVector("Dir::Bin::Solvers"); std::string file; for (std::vector::const_iterator dir = solverDirs.begin(); @@ -520,10 +554,16 @@ bool EDSP::ExecuteSolver(const char* const solver, int *solver_in, int *solver_o } if (file.empty() == true) - return _error->Error("Can't call external solver '%s' as it is not in a configured directory!", solver); + { + _error->Error("Can't call external solver '%s' as it is not in a configured directory!", solver); + return 0; + } int external[4] = 
{-1, -1, -1, -1}; if (pipe(external) != 0 || pipe(external + 2) != 0) - return _error->Errno("Resolve", "Can't create needed IPC pipes for EDSP"); + { + _error->Errno("Resolve", "Can't create needed IPC pipes for EDSP"); + return 0; + } for (int i = 0; i < 4; ++i) SetCloseExec(external[i], true); @@ -540,11 +580,19 @@ bool EDSP::ExecuteSolver(const char* const solver, int *solver_in, int *solver_o close(external[3]); if (WaitFd(external[1], true, 5) == false) - return _error->Errno("Resolve", "Timed out while Waiting on availability of solver stdin"); + { + _error->Errno("Resolve", "Timed out while Waiting on availability of solver stdin"); + return 0; + } *solver_in = external[1]; *solver_out = external[2]; - return true; + return Solver; +} +bool EDSP::ExecuteSolver(const char* const solver, int *solver_in, int *solver_out) { + if (ExecuteSolver(solver, solver_in, solver_out, true) == 0) + return false; + return true; } /*}}}*/ // EDSP::ResolveExternal - resolve problems by asking external for help {{{*/ @@ -552,7 +600,8 @@ bool EDSP::ResolveExternal(const char* const solver, pkgDepCache &Cache, bool const upgrade, bool const distUpgrade, bool const autoRemove, OpProgress *Progress) { int solver_in, solver_out; - if (EDSP::ExecuteSolver(solver, &solver_in, &solver_out) == false) + pid_t const solver_pid = EDSP::ExecuteSolver(solver, &solver_in, &solver_out, true); + if (solver_pid == 0) return false; FILE* output = fdopen(solver_in, "w"); @@ -572,6 +621,6 @@ bool EDSP::ResolveExternal(const char* const solver, pkgDepCache &Cache, if (EDSP::ReadResponse(solver_out, Cache, Progress) == false) return false; - return true; + return ExecWait(solver_pid, solver); } /*}}}*/ diff --git a/apt-pkg/edsp.h b/apt-pkg/edsp.h index f3092d3c6..9e833556a 100644 --- a/apt-pkg/edsp.h +++ b/apt-pkg/edsp.h @@ -205,10 +205,10 @@ public: * \param[out] solver_in will be the stdin of the solver * \param[out] solver_out will be the stdout of the solver * - * \return true if the solver could be started and the pipes - * are set up correctly, otherwise false and the pipes are invalid + * \return PID of the started solver or 0 if failure occurred */ - bool static ExecuteSolver(const char* const solver, int *solver_in, int *solver_out); + pid_t static ExecuteSolver(const char* const solver, int * const solver_in, int * const solver_out, bool /*overload*/); + APT_DEPRECATED bool static ExecuteSolver(const char* const solver, int *solver_in, int *solver_out); /** \brief call an external resolver to handle the request * diff --git a/apt-pkg/init.cc b/apt-pkg/init.cc index 3a35f852e..241628632 100644 --- a/apt-pkg/init.cc +++ b/apt-pkg/init.cc @@ -86,6 +86,7 @@ bool pkgInitConfig(Configuration &Cnf) Cnf.Set("Dir::Ignore-Files-Silently::", "\\.dpkg-[a-z]+$"); Cnf.Set("Dir::Ignore-Files-Silently::", "\\.save$"); Cnf.Set("Dir::Ignore-Files-Silently::", "\\.orig$"); + Cnf.Set("Dir::Ignore-Files-Silently::", "\\.distUpgrade$"); // Default cdrom mount point Cnf.CndSet("Acquire::cdrom::mount", "/media/cdrom/"); diff --git a/apt-pkg/install-progress.cc b/apt-pkg/install-progress.cc index 8bb587f67..cf6c85912 100644 --- a/apt-pkg/install-progress.cc +++ b/apt-pkg/install-progress.cc @@ -256,14 +256,14 @@ PackageManagerFancy::TermSize PackageManagerFancy::GetTerminalSize() { struct winsize win; - PackageManagerFancy::TermSize s; + PackageManagerFancy::TermSize s = { 0, 0 }; // FIXME: get from "child_pty" instead? 
if(ioctl(STDOUT_FILENO, TIOCGWINSZ, (char *)&win) != 0) return s; if(_config->FindB("Debug::InstallProgress::Fancy", false) == true) - std::cerr << "GetTerminalSize: " << win.ws_row << std::endl; + std::cerr << "GetTerminalSize: " << win.ws_row << " x " << win.ws_col << std::endl; s.rows = win.ws_row; s.columns = win.ws_col; @@ -275,6 +275,9 @@ void PackageManagerFancy::SetupTerminalScrollArea(int nr_rows) if(_config->FindB("Debug::InstallProgress::Fancy", false) == true) std::cerr << "SetupTerminalScrollArea: " << nr_rows << std::endl; + if (unlikely(nr_rows <= 1)) + return; + // scroll down a bit to avoid visual glitch when the screen // area shrinks by one row std::cout << "\n"; @@ -296,28 +299,30 @@ void PackageManagerFancy::SetupTerminalScrollArea(int nr_rows) // setup tty size to ensure xterm/linux console are working properly too // see bug #731738 struct winsize win; - ioctl(child_pty, TIOCGWINSZ, (char *)&win); - win.ws_row = nr_rows - 1; - ioctl(child_pty, TIOCSWINSZ, (char *)&win); + if (ioctl(child_pty, TIOCGWINSZ, (char *)&win) != -1) + { + win.ws_row = nr_rows - 1; + ioctl(child_pty, TIOCSWINSZ, (char *)&win); + } } void PackageManagerFancy::HandleSIGWINCH(int) { - int nr_terminal_rows = GetTerminalSize().rows; + int const nr_terminal_rows = GetTerminalSize().rows; SetupTerminalScrollArea(nr_terminal_rows); + DrawStatusLine(); } void PackageManagerFancy::Start(int a_child_pty) { child_pty = a_child_pty; - int nr_terminal_rows = GetTerminalSize().rows; - if (nr_terminal_rows > 0) - SetupTerminalScrollArea(nr_terminal_rows); + int const nr_terminal_rows = GetTerminalSize().rows; + SetupTerminalScrollArea(nr_terminal_rows); } void PackageManagerFancy::Stop() { - int nr_terminal_rows = GetTerminalSize().rows; + int const nr_terminal_rows = GetTerminalSize().rows; if (nr_terminal_rows > 0) { SetupTerminalScrollArea(nr_terminal_rows + 1); @@ -358,7 +363,13 @@ bool PackageManagerFancy::StatusChanged(std::string PackageName, HumanReadableAction)) return false; - PackageManagerFancy::TermSize size = GetTerminalSize(); + return DrawStatusLine(); +} +bool PackageManagerFancy::DrawStatusLine() +{ + PackageManagerFancy::TermSize const size = GetTerminalSize(); + if (unlikely(size.rows < 1 || size.columns < 1)) + return false; static std::string save_cursor = "\033[s"; static std::string restore_cursor = "\033[u"; @@ -388,7 +399,7 @@ bool PackageManagerFancy::StatusChanged(std::string PackageName, { int padding = 4; float progressbar_size = size.columns - padding - progress_str.size(); - float current_percent = (float)StepsDone/(float)TotalSteps; + float current_percent = percentage / 100.0; std::cout << " " << GetTextProgressStr(current_percent, progressbar_size) << " "; diff --git a/apt-pkg/install-progress.h b/apt-pkg/install-progress.h index 112b034fb..5d1a20e9b 100644 --- a/apt-pkg/install-progress.h +++ b/apt-pkg/install-progress.h @@ -1,6 +1,8 @@ #ifndef PKGLIB_IPROGRESS_H #define PKGLIB_IPROGRESS_H +#include + #include #include #include @@ -120,6 +122,7 @@ namespace Progress { private: static void staticSIGWINCH(int); static std::vector instances; + APT_HIDDEN bool DrawStatusLine(); protected: void SetupTerminalScrollArea(int nr_rows); diff --git a/apt-pkg/makefile b/apt-pkg/makefile index 1d456873b..5603b51ed 100644 --- a/apt-pkg/makefile +++ b/apt-pkg/makefile @@ -11,6 +11,7 @@ include ../buildlib/defaults.mak # The library name and version (indirectly used from init.h) include ../buildlib/libversion.mak + LIBRARY=apt-pkg MAJOR=$(LIBAPTPKG_MAJOR) MINOR=$(LIBAPTPKG_RELEASE) @@ 
-26,50 +27,7 @@ SLIBS+= -llzma endif APT_DOMAIN:=libapt-pkg$(LIBAPTPKG_MAJOR) -# Source code for the contributed non-core things -SOURCE = contrib/mmap.cc contrib/error.cc contrib/strutl.cc \ - contrib/configuration.cc contrib/progress.cc contrib/cmndline.cc \ - contrib/hashsum.cc contrib/md5.cc contrib/sha1.cc \ - contrib/sha2_internal.cc contrib/hashes.cc \ - contrib/cdromutl.cc contrib/crc-16.cc contrib/netrc.cc \ - contrib/fileutl.cc contrib/gpgv.cc -HEADERS = mmap.h error.h configuration.h fileutl.h cmndline.h netrc.h \ - md5.h crc-16.h cdromutl.h strutl.h sptr.h sha1.h sha2.h sha256.h \ - sha2_internal.h hashes.h hashsum_template.h \ - macros.h weakptr.h gpgv.h - -# Source code for the core main library -SOURCE+= pkgcache.cc version.cc depcache.cc \ - orderlist.cc tagfile.cc sourcelist.cc packagemanager.cc \ - pkgrecords.cc algorithms.cc acquire.cc\ - acquire-worker.cc acquire-method.cc init.cc clean.cc \ - srcrecords.cc cachefile.cc versionmatch.cc policy.cc \ - pkgsystem.cc indexfile.cc pkgcachegen.cc acquire-item.cc \ - indexrecords.cc vendor.cc vendorlist.cc cdrom.cc indexcopy.cc \ - aptconfiguration.cc cachefilter.cc cacheset.cc edsp.cc \ - install-progress.cc upgrade.cc update.cc -HEADERS+= algorithms.h depcache.h pkgcachegen.h cacheiterators.h \ - orderlist.h sourcelist.h packagemanager.h tagfile.h \ - init.h pkgcache.h version.h progress.h pkgrecords.h \ - acquire.h acquire-worker.h acquire-item.h acquire-method.h \ - clean.h srcrecords.h cachefile.h versionmatch.h policy.h \ - pkgsystem.h indexfile.h metaindex.h indexrecords.h vendor.h \ - vendorlist.h cdrom.h indexcopy.h aptconfiguration.h \ - cachefilter.h cacheset.h edsp.h install-progress.h \ - upgrade.h update.h - -# Source code for the debian specific components -# In theory the deb headers do not need to be exported.. 
-SOURCE+= deb/deblistparser.cc deb/debrecords.cc deb/dpkgpm.cc \ - deb/debsrcrecords.cc deb/debversion.cc deb/debsystem.cc \ - deb/debindexfile.cc deb/debindexfile.cc deb/debmetaindex.cc -HEADERS+= debversion.h debsrcrecords.h dpkgpm.h debrecords.h \ - deblistparser.h debsystem.h debindexfile.h debmetaindex.h - -# Source code for the APT resolver interface specific components -SOURCE+= edsp/edsplistparser.cc edsp/edspindexfile.cc edsp/edspsystem.cc -HEADERS+= edsplistparser.h edspindexfile.h edspsystem.h - -HEADERS := $(addprefix apt-pkg/,$(HEADERS)) +SOURCE = $(wildcard *.cc */*.cc) +HEADERS = $(addprefix apt-pkg/,$(notdir $(wildcard *.h */*.h))) include $(LIBRARY_H) diff --git a/apt-pkg/packagemanager.cc b/apt-pkg/packagemanager.cc index 5d6bc6bd2..393f836f7 100644 --- a/apt-pkg/packagemanager.cc +++ b/apt-pkg/packagemanager.cc @@ -261,7 +261,7 @@ bool pkgPackageManager::CheckRConflicts(PkgIterator Pkg,DepIterator D, if (Cache.VS().CheckDep(Ver,D->CompareOp,D.TargetVer()) == false) continue; - if (EarlyRemove(D.ParentPkg()) == false) + if (EarlyRemove(D.ParentPkg(), &D) == false) return _error->Error("Reverse conflicts early remove for package '%s' failed", Pkg.FullName().c_str()); } @@ -313,18 +313,41 @@ bool pkgPackageManager::ConfigureAll() return true; } /*}}}*/ +// PM::NonLoopingSmart - helper to avoid loops while calling Smart methods /*{{{*/ +// ----------------------------------------------------------------------- +/* ensures that a loop of the form A depends B, B depends A (and similar) + is not leading us down into infinite recursion segfault land */ +bool pkgPackageManager::NonLoopingSmart(SmartAction const action, pkgCache::PkgIterator &Pkg, + pkgCache::PkgIterator DepPkg, int const Depth, bool const PkgLoop, + bool * const Bad, bool * const Changed) +{ + if (PkgLoop == false) + List->Flag(Pkg,pkgOrderList::Loop); + bool success = false; + switch(action) + { + case UNPACK_IMMEDIATE: success = SmartUnPack(DepPkg, true, Depth + 1); break; + case UNPACK: success = SmartUnPack(DepPkg, false, Depth + 1); break; + case CONFIGURE: success = SmartConfigure(DepPkg, Depth + 1); break; + } + if (PkgLoop == false) + List->RmFlag(Pkg,pkgOrderList::Loop); + + if (success == false) + return false; + + if (Bad != NULL) + *Bad = false; + if (Changed != NULL && List->IsFlag(DepPkg,pkgOrderList::Loop) == false) + *Changed = true; + return true; +} + /*}}}*/ // PM::SmartConfigure - Perform immediate configuration of the pkg /*{{{*/ // --------------------------------------------------------------------- /* This function tries to put the system in a state where Pkg can be configured. - This involves checking each of Pkg's dependanies and unpacking and - configuring packages where needed. - - Note on failure: This method can fail, without causing any problems. - This can happen when using Immediate-Configure-All, SmartUnPack may call - SmartConfigure, it may fail because of a complex dependency situation, but - a error will only be reported if ConfigureAll fails. This is why some of the - messages this function reports on failure (return false;) as just warnings - only shown when debuging*/ + This involves checking each of Pkg's dependencies and unpacking and + configuring packages where needed. 
*/ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) { // If this is true, only check and correct and dependencies without the Loop flag @@ -339,9 +362,9 @@ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) } VerIterator const instVer = Cache[Pkg].InstVerIter(Cache); - - /* Because of the ordered list, most dependencies should be unpacked, - however if there is a loop (A depends on B, B depends on A) this will not + + /* Because of the ordered list, most dependencies should be unpacked, + however if there is a loop (A depends on B, B depends on A) this will not be the case, so check for dependencies before configuring. */ bool Bad = false, Changed = false; const unsigned int max_loops = _config->FindI("APT::pkgPackageManager::MaxLoopCount", 5000); @@ -388,25 +411,15 @@ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) if (Debug) std::clog << OutputInDepth(Depth) << "Package " << Pkg << " loops in SmartConfigure" << std::endl; Bad = false; - break; } else { if (Debug) clog << OutputInDepth(Depth) << "Unpacking " << DepPkg.FullName() << " to avoid loop " << Cur << endl; - if (PkgLoop == false) - List->Flag(Pkg,pkgOrderList::Loop); - if (SmartUnPack(DepPkg, true, Depth + 1) == true) - { - Bad = false; - if (List->IsFlag(DepPkg,pkgOrderList::Loop) == false) - Changed = true; - } - if (PkgLoop == false) - List->RmFlag(Pkg,pkgOrderList::Loop); - if (Bad == false) - break; + if (NonLoopingSmart(UNPACK_IMMEDIATE, Pkg, DepPkg, Depth, PkgLoop, &Bad, &Changed) == false) + return false; } + break; } if (Cur == End || Bad == false) @@ -461,25 +474,12 @@ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) Bad = false; break; } - /* Check for a loop to prevent one forming - If A depends on B and B depends on A, SmartConfigure will - just hop between them if this is not checked. Dont remove the - loop flag after finishing however as loop is already set. - This means that there is another SmartConfigure call for this - package and it will remove the loop flag */ - if (PkgLoop == false) - List->Flag(Pkg,pkgOrderList::Loop); - if (SmartConfigure(DepPkg, Depth + 1) == true) - { - Bad = false; - if (List->IsFlag(DepPkg,pkgOrderList::Loop) == false) - Changed = true; - } - if (PkgLoop == false) - List->RmFlag(Pkg,pkgOrderList::Loop); - // If SmartConfigure was succesfull, Bad is false, so break - if (Bad == false) - break; + if (Debug) + std::clog << OutputInDepth(Depth) << "Configure already unpacked " << DepPkg << std::endl; + if (NonLoopingSmart(CONFIGURE, Pkg, DepPkg, Depth, PkgLoop, &Bad, &Changed) == false) + return false; + break; + } else if (List->IsFlag(DepPkg,pkgOrderList::Configured)) { @@ -498,19 +498,16 @@ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) if (i++ > max_loops) return _error->Error("Internal error: MaxLoopCount reached in SmartUnPack (2) for %s, aborting", Pkg.FullName().c_str()); } while (Changed == true); - - if (Bad) { - if (Debug) - _error->Warning(_("Could not configure '%s'. "),Pkg.FullName().c_str()); - return false; - } - + + if (Bad == true) + return _error->Error(_("Could not configure '%s'. 
"),Pkg.FullName().c_str()); + if (PkgLoop) return true; static std::string const conf = _config->Find("PackageManager::Configure","all"); static bool const ConfigurePkgs = (conf == "all" || conf == "smart"); - if (List->IsFlag(Pkg,pkgOrderList::Configured)) + if (List->IsFlag(Pkg,pkgOrderList::Configured)) return _error->Error("Internal configure error on '%s'.", Pkg.FullName().c_str()); if (ConfigurePkgs == true && Configure(Pkg) == false) @@ -527,7 +524,8 @@ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) Cache[P].InstallVer == 0 || (P.CurrentVer() == Cache[P].InstallVer && (Cache[Pkg].iFlags & pkgDepCache::ReInstall) != pkgDepCache::ReInstall)) continue; - SmartConfigure(P, (Depth +1)); + if (SmartConfigure(P, (Depth +1)) == false) + return false; } // Sanity Check @@ -541,29 +539,37 @@ bool pkgPackageManager::SmartConfigure(PkgIterator Pkg, int const Depth) // --------------------------------------------------------------------- /* This is called to deal with conflicts arising from unpacking */ bool pkgPackageManager::EarlyRemove(PkgIterator Pkg) +{ + return EarlyRemove(Pkg, NULL); +} +bool pkgPackageManager::EarlyRemove(PkgIterator Pkg, DepIterator const * const Dep) { if (List->IsNow(Pkg) == false) return true; - + // Already removed it if (List->IsFlag(Pkg,pkgOrderList::Removed) == true) return true; - + // Woops, it will not be re-installed! if (List->IsFlag(Pkg,pkgOrderList::InList) == false) return false; + // these breaks on M-A:same packages can be dealt with. They 'loop' by design + if (Dep != NULL && (*Dep)->Type == pkgCache::Dep::DpkgBreaks && Dep->IsMultiArchImplicit() == true) + return true; + // Essential packages get special treatment bool IsEssential = false; if ((Pkg->Flags & pkgCache::Flag::Essential) != 0 || (Pkg->Flags & pkgCache::Flag::Important) != 0) IsEssential = true; - /* Check for packages that are the dependents of essential packages and + /* Check for packages that are the dependents of essential packages and promote them too */ if (Pkg->CurrentVer != 0) { - for (DepIterator D = Pkg.RevDependsList(); D.end() == false && + for (pkgCache::DepIterator D = Pkg.RevDependsList(); D.end() == false && IsEssential == false; ++D) if (D->Type == pkgCache::Dep::Depends || D->Type == pkgCache::Dep::PreDepends) if ((D.ParentPkg()->Flags & pkgCache::Flag::Essential) != 0 || @@ -580,11 +586,14 @@ bool pkgPackageManager::EarlyRemove(PkgIterator Pkg) "but if you really want to do it, activate the " "APT::Force-LoopBreak option."),Pkg.FullName().c_str()); } - + // dpkg will auto-deconfigure it, no need for the big remove hammer + else if (Dep != NULL && (*Dep)->Type == pkgCache::Dep::DpkgBreaks) + return true; + bool Res = SmartRemove(Pkg); if (Cache[Pkg].Delete() == false) List->Flag(Pkg,pkgOrderList::Removed,pkgOrderList::States); - + return Res; } /*}}}*/ @@ -629,13 +638,14 @@ bool pkgPackageManager::SmartUnPack(PkgIterator Pkg, bool const Immediate, int c VerIterator const instVer = Cache[Pkg].InstVerIter(Cache); - /* PreUnpack Checks: This loop checks and attempts to rectify and problems that would prevent the package being unpacked. + /* PreUnpack Checks: This loop checks and attempts to rectify any problems that would prevent the package being unpacked. It addresses: PreDepends, Conflicts, Obsoletes and Breaks (DpkgBreaks). 
Any resolutions that do not require it should avoid configuration (calling SmartUnpack with Immediate=true), this is because when unpacking some packages with - complex dependency structures, trying to configure some packages while breaking the loops can complicate things . + complex dependency structures, trying to configure some packages while breaking the loops can complicate things. This will be either dealt with if the package is configured as a dependency of Pkg (if and when Pkg is configured), or by the ConfigureAll call at the end of the for loop in OrderInstall. */ - bool Changed = false; + bool SomethingBad = false, Changed = false; + bool couldBeTemporaryRemoved = Depth != 0 && List->IsFlag(Pkg,pkgOrderList::Removed) == false; const unsigned int max_loops = _config->FindI("APT::pkgPackageManager::MaxLoopCount", 5000); unsigned int i = 0; do @@ -683,184 +693,142 @@ bool pkgPackageManager::SmartUnPack(PkgIterator Pkg, bool const Immediate, int c for (Version **I = VList; *I != 0; ++I) { VerIterator Ver(Cache,*I); - PkgIterator Pkg = Ver.ParentPkg(); + PkgIterator DepPkg = Ver.ParentPkg(); // Not the install version - if (Cache[Pkg].InstallVer != *I || - (Cache[Pkg].Keep() == true && Pkg.State() == PkgIterator::NeedsNothing)) + if (Cache[DepPkg].InstallVer != *I || + (Cache[DepPkg].Keep() == true && DepPkg.State() == PkgIterator::NeedsNothing)) continue; - if (List->IsFlag(Pkg,pkgOrderList::Configured)) + if (List->IsFlag(DepPkg,pkgOrderList::Configured)) { Bad = false; break; } // check if it needs unpack or if if configure is enough - if (List->IsFlag(Pkg,pkgOrderList::UnPacked) == false) + if (List->IsFlag(DepPkg,pkgOrderList::UnPacked) == false) { if (Debug) - clog << OutputInDepth(Depth) << "Trying to SmartUnpack " << Pkg.FullName() << endl; - // SmartUnpack with the ImmediateFlag to ensure its really ready - if (SmartUnPack(Pkg, true, Depth + 1) == true) - { - Bad = false; - if (List->IsFlag(Pkg,pkgOrderList::Loop) == false) - Changed = true; - break; - } + clog << OutputInDepth(Depth) << "Trying to SmartUnpack " << DepPkg.FullName() << endl; + if (NonLoopingSmart(UNPACK_IMMEDIATE, Pkg, DepPkg, Depth, PkgLoop, &Bad, &Changed) == false) + return false; } else { if (Debug) - clog << OutputInDepth(Depth) << "Trying to SmartConfigure " << Pkg.FullName() << endl; - if (SmartConfigure(Pkg, Depth + 1) == true) - { - Bad = false; - if (List->IsFlag(Pkg,pkgOrderList::Loop) == false) - Changed = true; - break; - } + clog << OutputInDepth(Depth) << "Trying to SmartConfigure " << DepPkg.FullName() << endl; + if (NonLoopingSmart(CONFIGURE, Pkg, DepPkg, Depth, PkgLoop, &Bad, &Changed) == false) + return false; } + break; } } if (Bad == true) - { - if (Start == End) - return _error->Error("Couldn't configure pre-depend %s for %s, " - "probably a dependency cycle.", - End.TargetPkg().FullName().c_str(),Pkg.FullName().c_str()); - } - else - continue; + SomethingBad = true; } else if (End->Type == pkgCache::Dep::Conflicts || - End->Type == pkgCache::Dep::Obsoletes) + End->Type == pkgCache::Dep::Obsoletes || + End->Type == pkgCache::Dep::DpkgBreaks) { - /* Look for conflicts. Two packages that are both in the install - state cannot conflict so we don't check.. 
*/ SPtrArray VList = End.AllTargets(); - for (Version **I = VList; *I != 0; I++) + for (Version **I = VList; *I != 0; ++I) { VerIterator Ver(Cache,*I); PkgIterator ConflictPkg = Ver.ParentPkg(); - VerIterator InstallVer(Cache,Cache[ConflictPkg].InstallVer); + if (ConflictPkg.CurrentVer() != Ver) + { + if (Debug) + std::clog << OutputInDepth(Depth) << "Ignore not-installed version " << Ver.VerStr() << " of " << ConflictPkg.FullName() << " for " << End << std::endl; + continue; + } - // See if the current version is conflicting - if (ConflictPkg.CurrentVer() == Ver && List->IsNow(ConflictPkg)) + if (List->IsNow(ConflictPkg) == false) { if (Debug) - clog << OutputInDepth(Depth) << Pkg.FullName() << " conflicts with " << ConflictPkg.FullName() << endl; - /* If a loop is not present or has not yet been detected, attempt to unpack packages - to resolve this conflict. If there is a loop present, remove packages to resolve this conflict */ - if (List->IsFlag(ConflictPkg,pkgOrderList::Loop) == false) - { - if (Cache[ConflictPkg].Keep() == 0 && Cache[ConflictPkg].InstallVer != 0) - { - if (Debug) - clog << OutputInDepth(Depth) << OutputInDepth(Depth) << "Unpacking " << ConflictPkg.FullName() << " to prevent conflict" << endl; - List->Flag(Pkg,pkgOrderList::Loop); - if (SmartUnPack(ConflictPkg,false, Depth + 1) == true) - if (List->IsFlag(ConflictPkg,pkgOrderList::Loop) == false) - Changed = true; - // Remove loop to allow it to be used later if needed - List->RmFlag(Pkg,pkgOrderList::Loop); - } - else if (EarlyRemove(ConflictPkg) == false) - return _error->Error("Internal Error, Could not early remove %s (1)",ConflictPkg.FullName().c_str()); - } - else if (List->IsFlag(ConflictPkg,pkgOrderList::Removed) == false) + std::clog << OutputInDepth(Depth) << "Ignore already dealt-with version " << Ver.VerStr() << " of " << ConflictPkg.FullName() << " for " << End << std::endl; + continue; + } + + if (List->IsFlag(ConflictPkg,pkgOrderList::Removed) == true) + { + if (Debug) + clog << OutputInDepth(Depth) << "Ignoring " << End << " as " << ConflictPkg.FullName() << "was temporarily removed" << endl; + continue; + } + + if (List->IsFlag(ConflictPkg,pkgOrderList::Loop) && PkgLoop) + { + if (End->Type == pkgCache::Dep::DpkgBreaks && End.IsMultiArchImplicit() == true) { if (Debug) - clog << OutputInDepth(Depth) << "Because of conficts knot, removing " << ConflictPkg.FullName() << " to conflict violation" << endl; - if (EarlyRemove(ConflictPkg) == false) - return _error->Error("Internal Error, Could not early remove %s (2)",ConflictPkg.FullName().c_str()); + clog << OutputInDepth(Depth) << "Because dependency is MultiArchImplicit we ignored looping on: " << ConflictPkg << endl; + continue; } - } - } - } - else if (End->Type == pkgCache::Dep::DpkgBreaks) - { - SPtrArray VList = End.AllTargets(); - for (Version **I = VList; *I != 0; ++I) - { - VerIterator Ver(Cache,*I); - PkgIterator BrokenPkg = Ver.ParentPkg(); - if (BrokenPkg.CurrentVer() != Ver) - { if (Debug) - std::clog << OutputInDepth(Depth) << " Ignore not-installed version " << Ver.VerStr() << " of " << Pkg.FullName() << " for " << End << std::endl; + { + if (End->Type == pkgCache::Dep::DpkgBreaks) + clog << OutputInDepth(Depth) << "Because of breaks knot, deconfigure " << ConflictPkg.FullName() << " temporarily" << endl; + else + clog << OutputInDepth(Depth) << "Because of conflict knot, removing " << ConflictPkg.FullName() << " temporarily" << endl; + } + if (EarlyRemove(ConflictPkg, &End) == false) + return _error->Error("Internal Error, Could not early 
remove %s (2)",ConflictPkg.FullName().c_str()); + SomethingBad = true; continue; } - // Check if it needs to be unpacked - if (List->IsFlag(BrokenPkg,pkgOrderList::InList) && Cache[BrokenPkg].Delete() == false && - List->IsNow(BrokenPkg)) + if (Cache[ConflictPkg].Delete() == false) { - if (List->IsFlag(BrokenPkg,pkgOrderList::Loop) && PkgLoop) + if (Debug) { - // This dependency has already been dealt with by another SmartUnPack on Pkg - break; + clog << OutputInDepth(Depth) << "Unpacking " << ConflictPkg.FullName() << " to avoid " << End; + if (PkgLoop == true) + clog << " (Looping)"; + clog << std::endl; } - else + // we would like to avoid temporary removals and all that at best via a simple unpack + _error->PushToStack(); + if (NonLoopingSmart(UNPACK, Pkg, ConflictPkg, Depth, PkgLoop, NULL, &Changed) == false) { - // Found a break, so see if we can unpack the package to avoid it - // but do not set loop if another SmartUnPack already deals with it - // Also, avoid it if the package we would unpack pre-depends on this one - VerIterator InstallVer(Cache,Cache[BrokenPkg].InstallVer); - bool circle = false; - for (pkgCache::DepIterator D = InstallVer.DependsList(); D.end() == false; ++D) + // but if it fails ignore this failure and look for alternative ways of solving + if (Debug) + { + clog << OutputInDepth(Depth) << "Avoidance unpack of " << ConflictPkg.FullName() << " failed for " << End << std::endl; + _error->DumpErrors(std::clog); + } + _error->RevertToStack(); + // ignorance can only happen if a) one of the offenders is already gone + if (List->IsFlag(ConflictPkg,pkgOrderList::Removed) == true) { - if (D->Type != pkgCache::Dep::PreDepends) - continue; - SPtrArray VL = D.AllTargets(); - for (Version **I = VL; *I != 0; ++I) - { - VerIterator V(Cache,*I); - PkgIterator P = V.ParentPkg(); - // we are checking for installation as an easy 'protection' against or-groups and (unchosen) providers - if (P != Pkg || (P.CurrentVer() != V && Cache[P].InstallVer != V)) - continue; - circle = true; - break; - } - if (circle == true) - break; + if (Debug) + clog << OutputInDepth(Depth) << "But " << ConflictPkg.FullName() << " was temporarily removed in the meantime to satisfy " << End << endl; } - if (circle == true) + else if (List->IsFlag(Pkg,pkgOrderList::Removed) == true) { if (Debug) - clog << OutputInDepth(Depth) << " Avoiding " << End << " avoided as " << BrokenPkg.FullName() << " has a pre-depends on " << Pkg.FullName() << std::endl; - continue; + clog << OutputInDepth(Depth) << "But " << Pkg.FullName() << " was temporarily removed in the meantime to satisfy " << End << endl; } + // or b) we can make one go (removal or dpkg auto-deconfigure) else { if (Debug) - { - clog << OutputInDepth(Depth) << " Unpacking " << BrokenPkg.FullName() << " to avoid " << End; - if (PkgLoop == true) - clog << " (Looping)"; - clog << std::endl; - } - if (PkgLoop == false) - List->Flag(Pkg,pkgOrderList::Loop); - if (SmartUnPack(BrokenPkg, false, Depth + 1) == true) - { - if (List->IsFlag(BrokenPkg,pkgOrderList::Loop) == false) - Changed = true; - } - if (PkgLoop == false) - List->RmFlag(Pkg,pkgOrderList::Loop); + clog << OutputInDepth(Depth) << "So temprorary remove/deconfigure " << ConflictPkg.FullName() << " to satisfy " << End << endl; + if (EarlyRemove(ConflictPkg, &End) == false) + return _error->Error("Internal Error, Could not early remove %s (2)",ConflictPkg.FullName().c_str()); } } + else + _error->MergeWithStack(); } - // Check if a package needs to be removed - else if (Cache[BrokenPkg].Delete() == 
true && List->IsFlag(BrokenPkg,pkgOrderList::Configured) == false) + else { if (Debug) - clog << OutputInDepth(Depth) << " Removing " << BrokenPkg.FullName() << " to avoid " << End << endl; - SmartRemove(BrokenPkg); + clog << OutputInDepth(Depth) << "Removing " << ConflictPkg.FullName() << " now to avoid " << End << endl; + // no earlyremove() here as user has already agreed to the permanent removal + if (SmartRemove(Pkg) == false) + return _error->Error("Internal Error, Could not early remove %s (1)",ConflictPkg.FullName().c_str()); } } } @@ -868,7 +836,17 @@ bool pkgPackageManager::SmartUnPack(PkgIterator Pkg, bool const Immediate, int c if (i++ > max_loops) return _error->Error("Internal error: APT::pkgPackageManager::MaxLoopCount reached in SmartConfigure for %s, aborting", Pkg.FullName().c_str()); } while (Changed == true); - + + if (SomethingBad == true) + return _error->Error("Couldn't configure %s, probably a dependency cycle.", Pkg.FullName().c_str()); + + if (couldBeTemporaryRemoved == true && List->IsFlag(Pkg,pkgOrderList::Removed) == true) + { + if (Debug) + std::clog << OutputInDepth(Depth) << "Prevent unpack as " << Pkg << " is currently temporarily removed" << std::endl; + return true; + } + // Check for reverse conflicts. if (CheckRConflicts(Pkg,Pkg.RevDependsList(), instVer.VerStr()) == false) @@ -929,7 +907,7 @@ bool pkgPackageManager::SmartUnPack(PkgIterator Pkg, bool const Immediate, int c if (Immediate == true) { // Perform immedate configuration of the package. if (SmartConfigure(Pkg, Depth + 1) == false) - _error->Warning(_("Could not perform immediate configuration on '%s'. " + _error->Error(_("Could not perform immediate configuration on '%s'. " "Please see man 5 apt.conf under APT::Immediate-Configure for details. (%d)"),Pkg.FullName().c_str(),2); } diff --git a/apt-pkg/packagemanager.h b/apt-pkg/packagemanager.h index 344ed9192..d72790b6e 100644 --- a/apt-pkg/packagemanager.h +++ b/apt-pkg/packagemanager.h @@ -79,13 +79,14 @@ class pkgPackageManager : protected pkgCache::Namespace // Install helpers bool ConfigureAll(); - bool SmartConfigure(PkgIterator Pkg, int const Depth); + bool SmartConfigure(PkgIterator Pkg, int const Depth) APT_MUSTCHECK; //FIXME: merge on abi break - bool SmartUnPack(PkgIterator Pkg); - bool SmartUnPack(PkgIterator Pkg, bool const Immediate, int const Depth); - bool SmartRemove(PkgIterator Pkg); - bool EarlyRemove(PkgIterator Pkg); - + bool SmartUnPack(PkgIterator Pkg) APT_MUSTCHECK; + bool SmartUnPack(PkgIterator Pkg, bool const Immediate, int const Depth) APT_MUSTCHECK; + bool SmartRemove(PkgIterator Pkg) APT_MUSTCHECK; + bool EarlyRemove(PkgIterator Pkg, DepIterator const * const Dep) APT_MUSTCHECK; + APT_DEPRECATED bool EarlyRemove(PkgIterator Pkg) APT_MUSTCHECK; + // The Actual installation implementation virtual bool Install(PkgIterator /*Pkg*/,std::string /*File*/) {return false;}; virtual bool Configure(PkgIterator /*Pkg*/) {return false;}; @@ -139,6 +140,12 @@ class pkgPackageManager : protected pkgCache::Namespace pkgPackageManager(pkgDepCache *Cache); virtual ~pkgPackageManager(); + + private: + enum APT_HIDDEN SmartAction { UNPACK_IMMEDIATE, UNPACK, CONFIGURE }; + APT_HIDDEN bool NonLoopingSmart(SmartAction const action, pkgCache::PkgIterator &Pkg, + pkgCache::PkgIterator DepPkg, int const Depth, bool const PkgLoop, + bool * const Bad, bool * const Changed) APT_MUSTCHECK; }; #endif diff --git a/apt-pkg/pkgcache.cc b/apt-pkg/pkgcache.cc index 91b75f52e..58a63459f 100644 --- a/apt-pkg/pkgcache.cc +++ b/apt-pkg/pkgcache.cc @@ 
-822,7 +822,7 @@ int pkgCache::VerIterator::CompareVer(const VerIterator &B) const // VerIterator::Downloadable - Checks if the version is downloadable /*{{{*/ // --------------------------------------------------------------------- /* */ -bool pkgCache::VerIterator::Downloadable() const +APT_PURE bool pkgCache::VerIterator::Downloadable() const { VerFileIterator Files = FileList(); for (; Files.end() == false; ++Files) @@ -835,7 +835,7 @@ bool pkgCache::VerIterator::Downloadable() const // --------------------------------------------------------------------- /* This checks to see if any of the versions files are not NotAutomatic. True if this version is selectable for automatic installation. */ -bool pkgCache::VerIterator::Automatic() const +APT_PURE bool pkgCache::VerIterator::Automatic() const { VerFileIterator Files = FileList(); for (; Files.end() == false; ++Files) diff --git a/apt-pkg/srcrecords.cc b/apt-pkg/srcrecords.cc index 775cf2e5f..f4d034b85 100644 --- a/apt-pkg/srcrecords.cc +++ b/apt-pkg/srcrecords.cc @@ -81,6 +81,27 @@ bool pkgSrcRecords::Restart() return true; } /*}}}*/ +// SrcRecords::Next - Step to the next Source Record /*{{{*/ +// --------------------------------------------------------------------- +/* Step to the next source package record */ +const pkgSrcRecords::Parser* pkgSrcRecords::Next() +{ + if (Current == Files.end()) + return 0; + + // Step to the next record, possibly switching files + while ((*Current)->Step() == false) + { + if (_error->PendingError() == true) + return 0; + ++Current; + if (Current == Files.end()) + return 0; + } + + return *Current; +} + /*}}}*/ // SrcRecords::Find - Find the first source package with the given name /*{{{*/ // --------------------------------------------------------------------- /* This searches on both source package names and output binary names and @@ -88,21 +109,11 @@ bool pkgSrcRecords::Restart() function to be called multiple times to get successive entries */ pkgSrcRecords::Parser *pkgSrcRecords::Find(const char *Package,bool const &SrcOnly) { - if (Current == Files.end()) - return 0; - while (true) { - // Step to the next record, possibly switching files - while ((*Current)->Step() == false) - { - if (_error->PendingError() == true) - return 0; - ++Current; - if (Current == Files.end()) - return 0; - } - + if(Next() == 0) + return 0; + // IO error somehow if (_error->PendingError() == true) return 0; diff --git a/apt-pkg/srcrecords.h b/apt-pkg/srcrecords.h index 9915debfe..82460d70f 100644 --- a/apt-pkg/srcrecords.h +++ b/apt-pkg/srcrecords.h @@ -95,8 +95,13 @@ class pkgSrcRecords // Reset the search bool Restart(); - // Locate a package by name - Parser *Find(const char *Package,bool const &SrcOnly = false); + // Step to the next SourcePackage and return pointer to the + // next SourceRecord. The pointer is owned by libapt. + const Parser* Next(); + + // Locate a package by name and return pointer to the Parser. + // The pointer is owned by libapt. 
+ Parser* Find(const char *Package,bool const &SrcOnly = false); pkgSrcRecords(pkgSourceList &List); virtual ~pkgSrcRecords(); diff --git a/apt-private/makefile b/apt-private/makefile index 09736c6d3..9a3fbdb29 100644 --- a/apt-private/makefile +++ b/apt-private/makefile @@ -8,9 +8,6 @@ HEADER_TARGETDIRS = apt-private # Bring in the default rules include ../buildlib/defaults.mak -# The library name and version (indirectly used from init.h) -include ../buildlib/libversion.mak - # The library name LIBRARY=apt-private MAJOR=0.0 @@ -18,12 +15,7 @@ MINOR=0 SLIBS=$(PTHREADLIB) -lapt-pkg CXXFLAGS += -fvisibility=hidden -fvisibility-inlines-hidden -PRIVATES=list install download output cachefile cacheset update upgrade cmndline moo search show main utils sources -SOURCE += $(foreach private, $(PRIVATES), private-$(private).cc) -HEADERS += $(foreach private, $(PRIVATES), private-$(private).h) - -SOURCE+= acqprogress.cc -HEADERS+= acqprogress.h private-cacheset.h +SOURCE = $(wildcard *.cc) +HEADERS = $(addprefix apt-private/,$(wildcard *.h)) -HEADERS := $(addprefix apt-private/,$(HEADERS)) include $(LIBRARY_H) diff --git a/apt-private/private-cacheset.cc b/apt-private/private-cacheset.cc index 4a63c7e81..e37e7b227 100644 --- a/apt-private/private-cacheset.cc +++ b/apt-private/private-cacheset.cc @@ -73,7 +73,13 @@ bool GetLocalitySortedVersionSet(pkgCacheFile &CacheFile, else { pkgPolicy *policy = CacheFile.GetPolicy(); - output_set.insert(policy->GetCandidateVer(P)); + if (policy->GetCandidateVer(P).IsGood()) + output_set.insert(policy->GetCandidateVer(P)); + else + // no candidate, this may happen for packages in + // dpkg "deinstall ok config-file" state - we pick the first ver + // (which should be the only one) + output_set.insert(P.VersionList()); } } progress.Done(); diff --git a/apt-private/private-cmndline.cc b/apt-private/private-cmndline.cc index 682be0a19..a21a9dc8c 100644 --- a/apt-private/private-cmndline.cc +++ b/apt-private/private-cmndline.cc @@ -116,7 +116,7 @@ static bool addArgumentsAPTConfig(std::vector &Args, char con static bool addArgumentsAPTGet(std::vector &Args, char const * const Cmd)/*{{{*/ { if (CmdMatches("install", "remove", "purge", "upgrade", "dist-upgrade", - "dselect-upgrade", "autoremove")) + "dselect-upgrade", "autoremove", "full-upgrade")) { addArg(0, "show-progress", "DpkgPM::Progress", 0); addArg('f', "fix-broken", "APT::Get::Fix-Broken", 0); @@ -163,7 +163,7 @@ static bool addArgumentsAPTGet(std::vector &Args, char const if (CmdMatches("install", "remove", "purge", "upgrade", "dist-upgrade", "deselect-upgrade", "autoremove", "clean", "autoclean", "check", - "build-dep")) + "build-dep", "full-upgrade")) { addArg('s', "simulate", "APT::Get::Simulate", 0); addArg('s', "just-print", "APT::Get::Simulate", 0); diff --git a/apt-private/private-download.cc b/apt-private/private-download.cc index a095f0c67..be7d23c31 100644 --- a/apt-private/private-download.cc +++ b/apt-private/private-download.cc @@ -28,6 +28,11 @@ bool CheckAuth(pkgAcquire& Fetcher, bool const PromptUser) if (UntrustedList == "") return true; + return AuthPrompt(UntrustedList, PromptUser); +} + +bool AuthPrompt(std::string UntrustedList, bool const PromptUser) +{ ShowList(c2out,_("WARNING: The following packages cannot be authenticated!"),UntrustedList,""); if (_config->FindB("APT::Get::AllowUnauthenticated",false) == true) diff --git a/apt-private/private-download.h b/apt-private/private-download.h index a108aa531..a90ac7eaa 100644 --- a/apt-private/private-download.h +++ 
b/apt-private/private-download.h @@ -5,7 +5,13 @@ class pkgAcquire; +// Check if all files in the fetcher are authenticated APT_PUBLIC bool CheckAuth(pkgAcquire& Fetcher, bool const PromptUser); + +// show a authentication warning prompt and return true if the system +// should continue +APT_PUBLIC bool AuthPrompt(std::string UntrustedList, bool const PromptUser); + APT_PUBLIC bool AcquireRun(pkgAcquire &Fetcher, int const PulseInterval, bool * const Failure, bool * const TransientNetworkFailure); #endif diff --git a/apt-private/private-install.cc b/apt-private/private-install.cc index 107ed398e..a365d4294 100644 --- a/apt-private/private-install.cc +++ b/apt-private/private-install.cc @@ -617,7 +617,8 @@ bool DoCacheManipulationFromCommandLine(CommandLine &CmdL, CacheFile &Cache, if (Fix != NULL) { // Call the scored problem resolver - Fix->Resolve(true); + if (Fix->Resolve(true) == false && Cache->BrokenCount() == 0) + return false; } // Now we check the state of the packages, @@ -801,3 +802,144 @@ bool DoInstall(CommandLine &CmdL) return InstallPackages(Cache,false); } /*}}}*/ + +// TryToInstall - Mark a package for installation /*{{{*/ +void TryToInstall::operator() (pkgCache::VerIterator const &Ver) { + pkgCache::PkgIterator Pkg = Ver.ParentPkg(); + + Cache->GetDepCache()->SetCandidateVersion(Ver); + pkgDepCache::StateCache &State = (*Cache)[Pkg]; + + // Handle the no-upgrade case + if (_config->FindB("APT::Get::upgrade",true) == false && Pkg->CurrentVer != 0) + ioprintf(c1out,_("Skipping %s, it is already installed and upgrade is not set.\n"), + Pkg.FullName(true).c_str()); + // Ignore request for install if package would be new + else if (_config->FindB("APT::Get::Only-Upgrade", false) == true && Pkg->CurrentVer == 0) + ioprintf(c1out,_("Skipping %s, it is not installed and only upgrades are requested.\n"), + Pkg.FullName(true).c_str()); + else { + if (Fix != NULL) { + Fix->Clear(Pkg); + Fix->Protect(Pkg); + } + Cache->GetDepCache()->MarkInstall(Pkg,false); + + if (State.Install() == false) { + if (_config->FindB("APT::Get::ReInstall",false) == true) { + if (Pkg->CurrentVer == 0 || Pkg.CurrentVer().Downloadable() == false) + ioprintf(c1out,_("Reinstallation of %s is not possible, it cannot be downloaded.\n"), + Pkg.FullName(true).c_str()); + else + Cache->GetDepCache()->SetReInstall(Pkg, true); + } else + ioprintf(c1out,_("%s is already the newest version.\n"), + Pkg.FullName(true).c_str()); + } + + // Install it with autoinstalling enabled (if we not respect the minial + // required deps or the policy) + if (FixBroken == false) + doAutoInstallLater.insert(Pkg); + } + + // see if we need to fix the auto-mark flag + // e.g. 
apt-get install foo + // where foo is marked automatic + if (State.Install() == false && + (State.Flags & pkgCache::Flag::Auto) && + _config->FindB("APT::Get::ReInstall",false) == false && + _config->FindB("APT::Get::Only-Upgrade",false) == false && + _config->FindB("APT::Get::Download-Only",false) == false) + { + ioprintf(c1out,_("%s set to manually installed.\n"), + Pkg.FullName(true).c_str()); + Cache->GetDepCache()->MarkAuto(Pkg,false); + AutoMarkChanged++; + } +} + /*}}}*/ +bool TryToInstall::propergateReleaseCandiateSwitching(std::list > const &start, std::ostream &out)/*{{{*/ +{ + for (std::list >::const_iterator s = start.begin(); + s != start.end(); ++s) + Cache->GetDepCache()->SetCandidateVersion(s->first); + + bool Success = true; + // the Changed list contains: + // first: "new version" + // second: "what-caused the change" + std::list > Changed; + for (std::list >::const_iterator s = start.begin(); + s != start.end(); ++s) + { + Changed.push_back(std::make_pair(s->first, pkgCache::VerIterator(*Cache))); + // We continue here even if it failed to enhance the ShowBroken output + Success &= Cache->GetDepCache()->SetCandidateRelease(s->first, s->second, Changed); + } + for (std::list >::const_iterator c = Changed.begin(); + c != Changed.end(); ++c) + { + if (c->second.end() == true) + ioprintf(out, _("Selected version '%s' (%s) for '%s'\n"), + c->first.VerStr(), c->first.RelStr().c_str(), c->first.ParentPkg().FullName(true).c_str()); + else if (c->first.ParentPkg()->Group != c->second.ParentPkg()->Group) + { + pkgCache::VerIterator V = (*Cache)[c->first.ParentPkg()].CandidateVerIter(*Cache); + ioprintf(out, _("Selected version '%s' (%s) for '%s' because of '%s'\n"), V.VerStr(), + V.RelStr().c_str(), V.ParentPkg().FullName(true).c_str(), c->second.ParentPkg().FullName(true).c_str()); + } + } + return Success; +} + /*}}}*/ +void TryToInstall::doAutoInstall() { /*{{{*/ + for (APT::PackageSet::const_iterator P = doAutoInstallLater.begin(); + P != doAutoInstallLater.end(); ++P) { + pkgDepCache::StateCache &State = (*Cache)[P]; + if (State.InstBroken() == false && State.InstPolicyBroken() == false) + continue; + Cache->GetDepCache()->MarkInstall(P, true); + } + doAutoInstallLater.clear(); +} + /*}}}*/ +// TryToRemove - Mark a package for removal /*{{{*/ +void TryToRemove::operator() (pkgCache::VerIterator const &Ver) +{ + pkgCache::PkgIterator Pkg = Ver.ParentPkg(); + + if (Fix != NULL) + { + Fix->Clear(Pkg); + Fix->Protect(Pkg); + Fix->Remove(Pkg); + } + + if ((Pkg->CurrentVer == 0 && PurgePkgs == false) || + (PurgePkgs == true && Pkg->CurrentState == pkgCache::State::NotInstalled)) + { + pkgCache::GrpIterator Grp = Pkg.Group(); + pkgCache::PkgIterator P = Grp.PackageList(); + for (; P.end() != true; P = Grp.NextPkg(P)) + { + if (P == Pkg) + continue; + if (P->CurrentVer != 0 || (PurgePkgs == true && P->CurrentState != pkgCache::State::NotInstalled)) + { + // TRANSLATORS: Note, this is not an interactive question + ioprintf(c1out,_("Package '%s' is not installed, so not removed. 
Did you mean '%s'?\n"), + Pkg.FullName(true).c_str(), P.FullName(true).c_str()); + break; + } + } + if (P.end() == true) + ioprintf(c1out,_("Package '%s' is not installed, so not removed\n"),Pkg.FullName(true).c_str()); + + // MarkInstall refuses to install packages on hold + Pkg->SelectedState = pkgCache::State::Hold; + } + else + Cache->GetDepCache()->MarkDelete(Pkg, PurgePkgs); +} + /*}}}*/ diff --git a/apt-private/private-install.h b/apt-private/private-install.h index 5e18560c5..828163e40 100644 --- a/apt-private/private-install.h +++ b/apt-private/private-install.h @@ -3,28 +3,18 @@ #include #include -#include #include #include #include -#include -#include #include -#include - -#include -#include #include -#include #include #include - -#include - class CacheFile; class CommandLine; +class pkgProblemResolver; #define RAMFS_MAGIC 0x858458f6 @@ -39,7 +29,7 @@ APT_PUBLIC bool InstallPackages(CacheFile &Cache,bool ShwKept,bool Ask = true, // TryToInstall - Mark a package for installation /*{{{*/ -struct TryToInstall { +struct APT_PUBLIC TryToInstall { pkgCacheFile* Cache; pkgProblemResolver* Fix; bool FixBroken; @@ -49,109 +39,13 @@ struct TryToInstall { TryToInstall(pkgCacheFile &Cache, pkgProblemResolver *PM, bool const FixBroken) : Cache(&Cache), Fix(PM), FixBroken(FixBroken), AutoMarkChanged(0) {}; - void operator() (pkgCache::VerIterator const &Ver) { - pkgCache::PkgIterator Pkg = Ver.ParentPkg(); - - Cache->GetDepCache()->SetCandidateVersion(Ver); - pkgDepCache::StateCache &State = (*Cache)[Pkg]; - - // Handle the no-upgrade case - if (_config->FindB("APT::Get::upgrade",true) == false && Pkg->CurrentVer != 0) - ioprintf(c1out,_("Skipping %s, it is already installed and upgrade is not set.\n"), - Pkg.FullName(true).c_str()); - // Ignore request for install if package would be new - else if (_config->FindB("APT::Get::Only-Upgrade", false) == true && Pkg->CurrentVer == 0) - ioprintf(c1out,_("Skipping %s, it is not installed and only upgrades are requested.\n"), - Pkg.FullName(true).c_str()); - else { - if (Fix != NULL) { - Fix->Clear(Pkg); - Fix->Protect(Pkg); - } - Cache->GetDepCache()->MarkInstall(Pkg,false); - - if (State.Install() == false) { - if (_config->FindB("APT::Get::ReInstall",false) == true) { - if (Pkg->CurrentVer == 0 || Pkg.CurrentVer().Downloadable() == false) - ioprintf(c1out,_("Reinstallation of %s is not possible, it cannot be downloaded.\n"), - Pkg.FullName(true).c_str()); - else - Cache->GetDepCache()->SetReInstall(Pkg, true); - } else - ioprintf(c1out,_("%s is already the newest version.\n"), - Pkg.FullName(true).c_str()); - } - - // Install it with autoinstalling enabled (if we not respect the minial - // required deps or the policy) - if (FixBroken == false) - doAutoInstallLater.insert(Pkg); - } - - // see if we need to fix the auto-mark flag - // e.g. 
apt-get install foo - // where foo is marked automatic - if (State.Install() == false && - (State.Flags & pkgCache::Flag::Auto) && - _config->FindB("APT::Get::ReInstall",false) == false && - _config->FindB("APT::Get::Only-Upgrade",false) == false && - _config->FindB("APT::Get::Download-Only",false) == false) - { - ioprintf(c1out,_("%s set to manually installed.\n"), - Pkg.FullName(true).c_str()); - Cache->GetDepCache()->MarkAuto(Pkg,false); - AutoMarkChanged++; - } - } - - bool propergateReleaseCandiateSwitching(std::list > start, std::ostream &out) - { - for (std::list >::const_iterator s = start.begin(); - s != start.end(); ++s) - Cache->GetDepCache()->SetCandidateVersion(s->first); - - bool Success = true; - // the Changed list contains: - // first: "new version" - // second: "what-caused the change" - std::list > Changed; - for (std::list >::const_iterator s = start.begin(); - s != start.end(); ++s) - { - Changed.push_back(std::make_pair(s->first, pkgCache::VerIterator(*Cache))); - // We continue here even if it failed to enhance the ShowBroken output - Success &= Cache->GetDepCache()->SetCandidateRelease(s->first, s->second, Changed); - } - for (std::list >::const_iterator c = Changed.begin(); - c != Changed.end(); ++c) - { - if (c->second.end() == true) - ioprintf(out, _("Selected version '%s' (%s) for '%s'\n"), - c->first.VerStr(), c->first.RelStr().c_str(), c->first.ParentPkg().FullName(true).c_str()); - else if (c->first.ParentPkg()->Group != c->second.ParentPkg()->Group) - { - pkgCache::VerIterator V = (*Cache)[c->first.ParentPkg()].CandidateVerIter(*Cache); - ioprintf(out, _("Selected version '%s' (%s) for '%s' because of '%s'\n"), V.VerStr(), - V.RelStr().c_str(), V.ParentPkg().FullName(true).c_str(), c->second.ParentPkg().FullName(true).c_str()); - } - } - return Success; - } - - void doAutoInstall() { - for (APT::PackageSet::const_iterator P = doAutoInstallLater.begin(); - P != doAutoInstallLater.end(); ++P) { - pkgDepCache::StateCache &State = (*Cache)[P]; - if (State.InstBroken() == false && State.InstPolicyBroken() == false) - continue; - Cache->GetDepCache()->MarkInstall(P, true); - } - doAutoInstallLater.clear(); - } + void operator() (pkgCache::VerIterator const &Ver); + bool propergateReleaseCandiateSwitching(std::list > const &start, std::ostream &out); + void doAutoInstall(); }; /*}}}*/ // TryToRemove - Mark a package for removal /*{{{*/ -struct TryToRemove { +struct APT_PUBLIC TryToRemove { pkgCacheFile* Cache; pkgProblemResolver* Fix; bool PurgePkgs; @@ -159,43 +53,7 @@ struct TryToRemove { TryToRemove(pkgCacheFile &Cache, pkgProblemResolver *PM) : Cache(&Cache), Fix(PM), PurgePkgs(_config->FindB("APT::Get::Purge", false)) {}; - void operator() (pkgCache::VerIterator const &Ver) - { - pkgCache::PkgIterator Pkg = Ver.ParentPkg(); - - if (Fix != NULL) - { - Fix->Clear(Pkg); - Fix->Protect(Pkg); - Fix->Remove(Pkg); - } - - if ((Pkg->CurrentVer == 0 && PurgePkgs == false) || - (PurgePkgs == true && Pkg->CurrentState == pkgCache::State::NotInstalled)) - { - pkgCache::GrpIterator Grp = Pkg.Group(); - pkgCache::PkgIterator P = Grp.PackageList(); - for (; P.end() != true; P = Grp.NextPkg(P)) - { - if (P == Pkg) - continue; - if (P->CurrentVer != 0 || (PurgePkgs == true && P->CurrentState != pkgCache::State::NotInstalled)) - { - // TRANSLATORS: Note, this is not an interactive question - ioprintf(c1out,_("Package '%s' is not installed, so not removed. 
Did you mean '%s'?\n"), - Pkg.FullName(true).c_str(), P.FullName(true).c_str()); - break; - } - } - if (P.end() == true) - ioprintf(c1out,_("Package '%s' is not installed, so not removed\n"),Pkg.FullName(true).c_str()); - - // MarkInstall refuses to install packages on hold - Pkg->SelectedState = pkgCache::State::Hold; - } - else - Cache->GetDepCache()->MarkDelete(Pkg, PurgePkgs); - } + void operator() (pkgCache::VerIterator const &Ver); }; /*}}}*/ diff --git a/apt-private/private-list.cc b/apt-private/private-list.cc index b053cbcbe..b69002103 100644 --- a/apt-private/private-list.cc +++ b/apt-private/private-list.cc @@ -130,10 +130,11 @@ bool DoList(CommandLine &Cmd) Cache->Head().PackageCount, _("Listing")); GetLocalitySortedVersionSet(CacheFile, bag, matcher, progress); + bool ShowAllVersions = _config->FindB("APT::Cmd::All-Versions", false); for (LocalitySortedVersionSet::iterator V = bag.begin(); V != bag.end(); ++V) { std::stringstream outs; - if(_config->FindB("APT::Cmd::All-Versions", false) == true) + if(ShowAllVersions == true) { ListAllVersions(CacheFile, records, V.ParentPkg(), outs, includeSummary); output_map.insert(std::make_pair( @@ -151,6 +152,18 @@ bool DoList(CommandLine &Cmd) std::cout << (*K).second << std::endl; + // be nice and tell the user if there is more to see + if (bag.size() == 1 && ShowAllVersions == false) + { + // start with -1 as we already displayed one version + int versions = -1; + pkgCache::VerIterator Ver = *bag.begin(); + for ( ; Ver.end() == false; Ver++) + versions++; + if (versions > 0) + _error->Notice(P_("There is %i additional version. Please use the '-a' switch to see it", "There are %i additional versions. Please use the '-a' switch to see them.", versions), versions); + } + return true; } diff --git a/apt-private/private-main.cc b/apt-private/private-main.cc index 2d3965172..668b1733a 100644 --- a/apt-private/private-main.cc +++ b/apt-private/private-main.cc @@ -8,9 +8,18 @@ #include #include #include +#include #include + +void InitSignals() +{ + // Setup the signals + signal(SIGPIPE,SIG_IGN); +} + + void CheckSimulateMode(CommandLine &CmdL) { // simulate user-friendly if apt-get has no root privileges diff --git a/apt-private/private-main.h b/apt-private/private-main.h index 23d4aca68..a03bf4441 100644 --- a/apt-private/private-main.h +++ b/apt-private/private-main.h @@ -6,5 +6,6 @@ class CommandLine; APT_PUBLIC void CheckSimulateMode(CommandLine &CmdL); +APT_PUBLIC void InitSignals(); #endif diff --git a/apt-private/private-output.cc b/apt-private/private-output.cc index bbd8545ad..32f27a3dd 100644 --- a/apt-private/private-output.cc +++ b/apt-private/private-output.cc @@ -22,6 +22,8 @@ #include #include #include +#include +#include #include /*}}}*/ @@ -32,8 +34,24 @@ std::ostream c0out(0); std::ostream c1out(0); std::ostream c2out(0); std::ofstream devnull("/dev/null"); + + unsigned int ScreenWidth = 80 - 1; /* - 1 for the cursor */ +// SigWinch - Window size change signal handler /*{{{*/ +// --------------------------------------------------------------------- +/* */ +static void SigWinch(int) +{ + // Riped from GNU ls +#ifdef TIOCGWINSZ + struct winsize ws; + + if (ioctl(1, TIOCGWINSZ, &ws) != -1 && ws.ws_col >= 5) + ScreenWidth = ws.ws_col - 1; +#endif +} + /*}}}*/ bool InitOutput() /*{{{*/ { if (!isatty(STDOUT_FILENO) && _config->FindI("quiet", -1) == -1) @@ -47,6 +65,10 @@ bool InitOutput() /*{{{*/ if (_config->FindI("quiet",0) > 1) c1out.rdbuf(devnull.rdbuf()); + // deal with window size changes + signal(SIGWINCH,SigWinch); + 
SigWinch(0); + if(!isatty(1)) { _config->Set("APT::Color", "false"); @@ -146,6 +168,10 @@ static std::string GetArchitecture(pkgCacheFile &CacheFile, pkgCache::PkgIterato pkgCache::VerIterator inst = P.CurrentVer(); pkgCache::VerIterator cand = policy->GetCandidateVer(P); + // this may happen for packages in dpkg "deinstall ok config-file" state + if (inst.IsGood() == false && cand.IsGood() == false) + return P.VersionList().Arch(); + return inst ? inst.Arch() : cand.Arch(); } /*}}}*/ diff --git a/buildlib/library.mak b/buildlib/library.mak index ef1306b66..cc0286d7e 100644 --- a/buildlib/library.mak +++ b/buildlib/library.mak @@ -63,7 +63,7 @@ $(LIB)/lib$(LIBRARY).so.$(MAJOR).$(MINOR): $($(LOCAL)-HEADERS) $($(LOCAL)-OBJS) vpath %.cc $(SUBDIRS) $(OBJ)/%.opic: %.cc $(LIBRARYDEPENDS) echo Compiling $< to $@ - $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) $(PICFLAGS) -o $@ $< + $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) $(PICFLAGS) -o $@ $(abspath $<) $(DoDep) # Include the dependencies that are available diff --git a/buildlib/program.mak b/buildlib/program.mak index da538f5eb..71c265f39 100644 --- a/buildlib/program.mak +++ b/buildlib/program.mak @@ -50,7 +50,7 @@ $($(LOCAL)-BIN): $($(LOCAL)-OBJS) $($(LOCAL)-MKS) vpath %.cc $(SUBDIRS) $(OBJ)/%.o: %.cc echo Compiling $< to $@ - $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) -o $@ $< + $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) -o $@ $(abspath $<) $(DoDep) # Include the dependencies that are available diff --git a/buildlib/python.mak b/buildlib/python.mak index 02345c2d2..f08ab5563 100644 --- a/buildlib/python.mak +++ b/buildlib/python.mak @@ -58,7 +58,7 @@ endif # ifdef PYTHONLIB vpath %.cc $(SUBDIRS) $(OBJ)/%.opic: %.cc echo Compiling $< to $@ - $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) $(PICFLAGS) -o $@ $< + $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) $(PICFLAGS) -o $@ $(abspath $<) $(DoDep) # Include the dependencies that are available diff --git a/buildlib/staticlibrary.mak b/buildlib/staticlibrary.mak index ce9259dc0..86908700f 100644 --- a/buildlib/staticlibrary.mak +++ b/buildlib/staticlibrary.mak @@ -50,7 +50,7 @@ endif vpath %.cc $(SUBDIRS) $(OBJ)/%.o: %.cc echo Compiling $< to $@ - $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) -o $@ $< + $(CXX) -c $(INLINEDEPFLAG) $(CPPFLAGS) $(CXXFLAGS) -o $@ $(abspath $<) $(DoDep) # Include the dependencies that are available diff --git a/cmdline/acqprogress.cc b/cmdline/acqprogress.cc deleted file mode 100644 index c362c1edf..000000000 --- a/cmdline/acqprogress.cc +++ /dev/null @@ -1,306 +0,0 @@ -// -*- mode: cpp; mode: fold -*- -// Description /*{{{*/ -// $Id: acqprogress.cc,v 1.24 2003/04/27 01:56:48 doogie Exp $ -/* ###################################################################### - - Acquire Progress - Command line progress meter - - ##################################################################### */ - /*}}}*/ -// Include files /*{{{*/ -#include - -#include -#include -#include -#include -#include -#include - -#include -#include -#include -#include -#include - -#include "acqprogress.h" -#include - /*}}}*/ - -using namespace std; - -// AcqTextStatus::AcqTextStatus - Constructor /*{{{*/ -// --------------------------------------------------------------------- -/* */ -AcqTextStatus::AcqTextStatus(unsigned int &ScreenWidth,unsigned int Quiet) : - ScreenWidth(ScreenWidth), ID(0), Quiet(Quiet) -{ - BlankLine[0] = 0; -} - /*}}}*/ -// AcqTextStatus::Start - Downloading has started /*{{{*/ -// 
--------------------------------------------------------------------- -/* */ -void AcqTextStatus::Start() -{ - pkgAcquireStatus::Start(); - BlankLine[0] = 0; - ID = 1; -}; - /*}}}*/ -// AcqTextStatus::IMSHit - Called when an item got a HIT response /*{{{*/ -// --------------------------------------------------------------------- -/* */ -void AcqTextStatus::IMSHit(pkgAcquire::ItemDesc &Itm) -{ - if (Quiet > 1) - return; - - if (Quiet <= 0) - cout << '\r' << BlankLine << '\r'; - - cout << _("Hit ") << Itm.Description; - if (Itm.Owner->FileSize != 0) - cout << " [" << SizeToStr(Itm.Owner->FileSize) << "B]"; - cout << endl; - Update = true; -}; - /*}}}*/ -// AcqTextStatus::Fetch - An item has started to download /*{{{*/ -// --------------------------------------------------------------------- -/* This prints out the short description and the expected size */ -void AcqTextStatus::Fetch(pkgAcquire::ItemDesc &Itm) -{ - Update = true; - if (Itm.Owner->Complete == true) - return; - - Itm.Owner->ID = ID++; - - if (Quiet > 1) - return; - - if (Quiet <= 0) - cout << '\r' << BlankLine << '\r'; - - cout << _("Get:") << Itm.Owner->ID << ' ' << Itm.Description; - if (Itm.Owner->FileSize != 0) - cout << " [" << SizeToStr(Itm.Owner->FileSize) << "B]"; - cout << endl; -}; - /*}}}*/ -// AcqTextStatus::Done - Completed a download /*{{{*/ -// --------------------------------------------------------------------- -/* We don't display anything... */ -void AcqTextStatus::Done(pkgAcquire::ItemDesc &Itm) -{ - Update = true; -}; - /*}}}*/ -// AcqTextStatus::Fail - Called when an item fails to download /*{{{*/ -// --------------------------------------------------------------------- -/* We print out the error text */ -void AcqTextStatus::Fail(pkgAcquire::ItemDesc &Itm) -{ - if (Quiet > 1) - return; - - // Ignore certain kinds of transient failures (bad code) - if (Itm.Owner->Status == pkgAcquire::Item::StatIdle) - return; - - if (Quiet <= 0) - cout << '\r' << BlankLine << '\r'; - - if (Itm.Owner->Status == pkgAcquire::Item::StatDone) - { - cout << _("Ign ") << Itm.Description << endl; - } - else - { - cout << _("Err ") << Itm.Description << endl; - cout << " " << Itm.Owner->ErrorText << endl; - } - - Update = true; -}; - /*}}}*/ -// AcqTextStatus::Stop - Finished downloading /*{{{*/ -// --------------------------------------------------------------------- -/* This prints out the bytes downloaded and the overall average line - speed */ -void AcqTextStatus::Stop() -{ - pkgAcquireStatus::Stop(); - if (Quiet > 1) - return; - - if (Quiet <= 0) - cout << '\r' << BlankLine << '\r' << flush; - - if (FetchedBytes != 0 && _error->PendingError() == false) - ioprintf(cout,_("Fetched %sB in %s (%sB/s)\n"), - SizeToStr(FetchedBytes).c_str(), - TimeToStr(ElapsedTime).c_str(), - SizeToStr(CurrentCPS).c_str()); -} - /*}}}*/ -// AcqTextStatus::Pulse - Regular event pulse /*{{{*/ -// --------------------------------------------------------------------- -/* This draws the current progress. Each line has an overall percent - meter and a per active item status meter along with an overall - bandwidth and ETA indicator. 
*/ -bool AcqTextStatus::Pulse(pkgAcquire *Owner) -{ - pkgAcquireStatus::Pulse(Owner); - - if (Quiet > 0) - return true; - - enum {Long = 0,Medium,Short} Mode = Medium; - - char Buffer[sizeof(BlankLine)]; - char *End = Buffer + sizeof(Buffer); - char *S = Buffer; - if (ScreenWidth >= sizeof(Buffer)) - ScreenWidth = sizeof(Buffer)-1; - - // Put in the percent done - sprintf(S,"%.0f%%",((CurrentBytes + CurrentItems)*100.0)/(TotalBytes+TotalItems)); - - bool Shown = false; - for (pkgAcquire::Worker *I = Owner->WorkersBegin(); I != 0; - I = Owner->WorkerStep(I)) - { - S += strlen(S); - - // There is no item running - if (I->CurrentItem == 0) - { - if (I->Status.empty() == false) - { - snprintf(S,End-S," [%s]",I->Status.c_str()); - Shown = true; - } - - continue; - } - - Shown = true; - - // Add in the short description - if (I->CurrentItem->Owner->ID != 0) - snprintf(S,End-S," [%lu %s",I->CurrentItem->Owner->ID, - I->CurrentItem->ShortDesc.c_str()); - else - snprintf(S,End-S," [%s",I->CurrentItem->ShortDesc.c_str()); - S += strlen(S); - - // Show the short mode string - if (I->CurrentItem->Owner->Mode != 0) - { - snprintf(S,End-S," %s",I->CurrentItem->Owner->Mode); - S += strlen(S); - } - - // Add the current progress - if (Mode == Long) - snprintf(S,End-S," %llu",I->CurrentSize); - else - { - if (Mode == Medium || I->TotalSize == 0) - snprintf(S,End-S," %sB",SizeToStr(I->CurrentSize).c_str()); - } - S += strlen(S); - - // Add the total size and percent - if (I->TotalSize > 0 && I->CurrentItem->Owner->Complete == false) - { - if (Mode == Short) - snprintf(S,End-S," %.0f%%", - (I->CurrentSize*100.0)/I->TotalSize); - else - snprintf(S,End-S,"/%sB %.0f%%",SizeToStr(I->TotalSize).c_str(), - (I->CurrentSize*100.0)/I->TotalSize); - } - S += strlen(S); - snprintf(S,End-S,"]"); - } - - // Show something.. 
- if (Shown == false) - snprintf(S,End-S,_(" [Working]")); - - /* Put in the ETA and cps meter, block off signals to prevent strangeness - during resizing */ - sigset_t Sigs,OldSigs; - sigemptyset(&Sigs); - sigaddset(&Sigs,SIGWINCH); - sigprocmask(SIG_BLOCK,&Sigs,&OldSigs); - - if (CurrentCPS != 0) - { - char Tmp[300]; - unsigned long long ETA = (TotalBytes - CurrentBytes)/CurrentCPS; - sprintf(Tmp," %sB/s %s",SizeToStr(CurrentCPS).c_str(),TimeToStr(ETA).c_str()); - unsigned int Len = strlen(Buffer); - unsigned int LenT = strlen(Tmp); - if (Len + LenT < ScreenWidth) - { - memset(Buffer + Len,' ',ScreenWidth - Len); - strcpy(Buffer + ScreenWidth - LenT,Tmp); - } - } - Buffer[ScreenWidth] = 0; - BlankLine[ScreenWidth] = 0; - sigprocmask(SIG_SETMASK,&OldSigs,0); - - // Draw the current status - if (strlen(Buffer) == strlen(BlankLine)) - cout << '\r' << Buffer << flush; - else - cout << '\r' << BlankLine << '\r' << Buffer << flush; - memset(BlankLine,' ',strlen(Buffer)); - BlankLine[strlen(Buffer)] = 0; - - Update = false; - - return true; -} - /*}}}*/ -// AcqTextStatus::MediaChange - Media need to be swapped /*{{{*/ -// --------------------------------------------------------------------- -/* Prompt for a media swap */ -bool AcqTextStatus::MediaChange(string Media,string Drive) -{ - // If we do not output on a terminal and one of the options to avoid user - // interaction is given, we assume that no user is present who could react - // on your media change request - if (isatty(STDOUT_FILENO) != 1 && Quiet >= 2 && - (_config->FindB("APT::Get::Assume-Yes",false) == true || - _config->FindB("APT::Get::Force-Yes",false) == true || - _config->FindB("APT::Get::Trivial-Only",false) == true)) - - return false; - - if (Quiet <= 0) - cout << '\r' << BlankLine << '\r'; - ioprintf(cout,_("Media change: please insert the disc labeled\n" - " '%s'\n" - "in the drive '%s' and press enter\n"), - Media.c_str(),Drive.c_str()); - - char C = 0; - bool bStatus = true; - while (C != '\n' && C != '\r') - { - int len = read(STDIN_FILENO,&C,1); - if(C == 'c' || len <= 0) - bStatus = false; - } - - if(bStatus) - Update = true; - return bStatus; -} - /*}}}*/ diff --git a/cmdline/acqprogress.h b/cmdline/acqprogress.h deleted file mode 100644 index 8f0903923..000000000 --- a/cmdline/acqprogress.h +++ /dev/null @@ -1,39 +0,0 @@ -// -*- mode: cpp; mode: fold -*- -// Description /*{{{*/ -// $Id: acqprogress.h,v 1.5 2003/02/02 22:24:11 jgg Exp $ -/* ###################################################################### - - Acquire Progress - Command line progress meter - - ##################################################################### */ - /*}}}*/ -#ifndef ACQPROGRESS_H -#define ACQPROGRESS_H - -#include - -#include - -class AcqTextStatus : public pkgAcquireStatus -{ - unsigned int &ScreenWidth; - char BlankLine[1024]; - unsigned long ID; - unsigned long Quiet; - - public: - - virtual bool MediaChange(std::string Media,std::string Drive); - virtual void IMSHit(pkgAcquire::ItemDesc &Itm); - virtual void Fetch(pkgAcquire::ItemDesc &Itm); - virtual void Done(pkgAcquire::ItemDesc &Itm); - virtual void Fail(pkgAcquire::ItemDesc &Itm); - virtual void Start(); - virtual void Stop(); - - bool Pulse(pkgAcquire *Owner); - - AcqTextStatus(unsigned int &ScreenWidth,unsigned int Quiet); -}; - -#endif diff --git a/cmdline/apt-cache.cc b/cmdline/apt-cache.cc index 84b775390..1414617eb 100644 --- a/cmdline/apt-cache.cc +++ b/cmdline/apt-cache.cc @@ -507,7 +507,7 @@ static bool DumpAvail(CommandLine &) break; } - FileFd 
PkgF(File.FileName(),FileFd::ReadOnly); + FileFd PkgF(File.FileName(),FileFd::ReadOnly, FileFd::Extension); if (_error->PendingError() == true) break; diff --git a/cmdline/apt-get.cc b/cmdline/apt-get.cc index caf69da2a..a58386eb0 100644 --- a/cmdline/apt-get.cc +++ b/cmdline/apt-get.cc @@ -756,6 +756,7 @@ static bool DoSource(CommandLine &CmdL) // Load the requestd sources into the fetcher unsigned J = 0; + std::string UntrustedList; for (const char **I = CmdL.FileList + 1; *I != 0; I++, J++) { string Src; @@ -764,6 +765,9 @@ static bool DoSource(CommandLine &CmdL) if (Last == 0) { return _error->Error(_("Unable to find a source package for %s"),Src.c_str()); } + + if (Last->Index().IsTrusted() == false) + UntrustedList += Src + " "; string srec = Last->AsStr(); string::size_type pos = srec.find("\nVcs-"); @@ -847,6 +851,10 @@ static bool DoSource(CommandLine &CmdL) Last->Index().SourceInfo(*Last,*I),Src); } } + + // check authentication status of the source as well + if (UntrustedList != "" && !AuthPrompt(UntrustedList, false)) + return false; // Display statistics unsigned long long FetchBytes = Fetcher.FetchNeeded(); @@ -1664,20 +1672,6 @@ static bool ShowHelp(CommandLine &) "pages for more information and options.\n" " This APT has Super Cow Powers.\n"); return true; -} - /*}}}*/ -// SigWinch - Window size change signal handler /*{{{*/ -// --------------------------------------------------------------------- -/* */ -static void SigWinch(int) -{ - // Riped from GNU ls -#ifdef TIOCGWINSZ - struct winsize ws; - - if (ioctl(1, TIOCGWINSZ, &ws) != -1 && ws.ws_col >= 5) - ScreenWidth = ws.ws_col - 1; -#endif } /*}}}*/ int main(int argc,const char *argv[]) /*{{{*/ @@ -1734,14 +1728,12 @@ int main(int argc,const char *argv[]) /*{{{*/ // see if we are in simulate mode CheckSimulateMode(CmdL); + // Init the signals + InitSignals(); + // Setup the output streams InitOutput(); - // Setup the signals - signal(SIGPIPE,SIG_IGN); - signal(SIGWINCH,SigWinch); - SigWinch(0); - // Match the operation CmdL.DispatchArg(Cmds); diff --git a/cmdline/apt-helper.cc b/cmdline/apt-helper.cc index 2c1107d90..b0edafcbd 100644 --- a/cmdline/apt-helper.cc +++ b/cmdline/apt-helper.cc @@ -43,7 +43,8 @@ static bool DoDownloadFile(CommandLine &CmdL) std::string hash; if (CmdL.FileSize() > 3) hash = CmdL.FileList[3]; - new pkgAcqFile(&Fetcher, download_uri, hash, 0, "desc", "short-desc", + // we use download_uri as descr and targetfile as short-descr + new pkgAcqFile(&Fetcher, download_uri, hash, 0, download_uri, targetfile, "dest-dir-ignored", targetfile); Fetcher.Run(); bool Failed = false; diff --git a/cmdline/apt-internal-solver.cc b/cmdline/apt-internal-solver.cc index b85c07c33..e4cdf6381 100644 --- a/cmdline/apt-internal-solver.cc +++ b/cmdline/apt-internal-solver.cc @@ -31,6 +31,7 @@ #include #include #include +#include #include /*}}}*/ @@ -56,6 +57,12 @@ static bool ShowHelp(CommandLine &) { return true; } /*}}}*/ +APT_NORETURN static void DIE(std::string const &message) { /*{{{*/ + std::cerr << "ERROR: " << message << std::endl; + _error->DumpErrors(std::cerr); + exit(EXIT_FAILURE); +} + /*}}}*/ int main(int argc,const char *argv[]) /*{{{*/ { CommandLine::Args Args[] = { @@ -115,34 +122,29 @@ int main(int argc,const char *argv[]) /*{{{*/ EDSP::WriteProgress(0, "Start up solver…", output); - if (pkgInitSystem(*_config,_system) == false) { - std::cerr << "System could not be initialized!" 
<< std::endl; - return 1; - } + if (pkgInitSystem(*_config,_system) == false) + DIE("System could not be initialized!"); EDSP::WriteProgress(1, "Read request…", output); if (WaitFd(input, false, 5) == false) - std::cerr << "WAIT timed out in the resolver" << std::endl; + DIE("WAIT timed out in the resolver"); std::list install, remove; bool upgrade, distUpgrade, autoRemove; - if (EDSP::ReadRequest(input, install, remove, upgrade, distUpgrade, autoRemove) == false) { - std::cerr << "Parsing the request failed!" << std::endl; - return 2; - } + if (EDSP::ReadRequest(input, install, remove, upgrade, distUpgrade, autoRemove) == false) + DIE("Parsing the request failed!"); EDSP::WriteProgress(5, "Read scenario…", output); pkgCacheFile CacheFile; - CacheFile.Open(NULL, false); + if (CacheFile.Open(NULL, false) == false) + DIE("Failed to open CacheFile!"); EDSP::WriteProgress(50, "Apply request on scenario…", output); - if (EDSP::ApplyRequest(install, remove, CacheFile) == false) { - std::cerr << "Failed to apply request to depcache!" << std::endl; - return 3; - } + if (EDSP::ApplyRequest(install, remove, CacheFile) == false) + DIE("Failed to apply request to depcache!"); pkgProblemResolver Fix(CacheFile); for (std::list::const_iterator i = remove.begin(); @@ -183,10 +185,8 @@ int main(int argc,const char *argv[]) /*{{{*/ EDSP::WriteProgress(95, "Write solution…", output); - if (EDSP::WriteSolution(CacheFile, output) == false) { - std::cerr << "Failed to output the solution!" << std::endl; - return 4; - } + if (EDSP::WriteSolution(CacheFile, output) == false) + DIE("Failed to output the solution!"); EDSP::WriteProgress(100, "Done", output); diff --git a/cmdline/apt.cc b/cmdline/apt.cc index 8a6f96aea..2cfdf8e8e 100644 --- a/cmdline/apt.cc +++ b/cmdline/apt.cc @@ -96,6 +96,10 @@ int main(int argc, const char *argv[]) /*{{{*/ std::vector Args = getCommandArgs("apt", CommandLine::GetCommand(Cmds, argc, argv)); + // Init the signals + InitSignals(); + + // Init the output InitOutput(); // Set up gettext support diff --git a/configure.ac b/configure.ac index e135f8d5e..a16055652 100644 --- a/configure.ac +++ b/configure.ac @@ -18,7 +18,7 @@ AC_CONFIG_AUX_DIR(buildlib) AC_CONFIG_HEADER(include/config.h:buildlib/config.h.in include/apti18n.h:buildlib/apti18n.h.in) PACKAGE="apt" -PACKAGE_VERSION="1.0" +PACKAGE_VERSION="1.0.3" PACKAGE_MAIL="APT Development Team " AC_DEFINE_UNQUOTED(PACKAGE,"$PACKAGE") AC_DEFINE_UNQUOTED(PACKAGE_VERSION,"$PACKAGE_VERSION") diff --git a/debian/apt-doc.docs b/debian/apt-doc.docs index 4ec23f55d..df9701123 100644 --- a/debian/apt-doc.docs +++ b/debian/apt-doc.docs @@ -1,3 +1,2 @@ README.progress-reporting -README.MultiArch doc/external-dependency-solver-protocol.txt diff --git a/debian/apt.conf.autoremove b/debian/apt.conf.autoremove index cf69d56c3..fc02350ae 100644 --- a/debian/apt.conf.autoremove +++ b/debian/apt.conf.autoremove @@ -22,6 +22,8 @@ APT ".*-modules"; ".*-kernel"; "linux-backports-modules-.*"; + # tools + "linux-tools"; }; Never-MarkAuto-Sections diff --git a/debian/apt.install.in b/debian/apt.install.in index 4ffe43e3b..9c9489572 100644 --- a/debian/apt.install.in +++ b/debian/apt.install.in @@ -3,4 +3,4 @@ bin/apt-* usr/bin/ bin/methods/* usr/lib/apt/methods/ scripts/dselect/* usr/lib/dpkg/methods/apt/ usr/share/locale/*/*/apt.mo -bin/libapt-private.so.* usr/lib/@DEB_HOST_MULTIARCH@/ \ No newline at end of file +bin/libapt-private.so.* usr/lib/@DEB_HOST_MULTIARCH@/ diff --git a/debian/apt.maintscript b/debian/apt.maintscript new file mode 100644 index 
000000000..939769355 --- /dev/null +++ b/debian/apt.maintscript @@ -0,0 +1,3 @@ +rm_conffile /etc/apt/apt.conf.d/20changelog 1.0.3 -- "@" + + diff --git a/debian/changelog b/debian/changelog index d3beac0ef..57fa52f29 100644 --- a/debian/changelog +++ b/debian/changelog @@ -1,3 +1,86 @@ +apt (1.0.3) unstable; urgency=medium + + [ Michael Vogt ] + * reduce delta to ubuntu + * provide support for vendor specific config files + * debian/apt-doc.docs: remove README.MultiArch + * Fix missing ScreenWidth check in apt.cc + * Only do openpty() if both stdin/stdout are terminals (Closes: 746434) + + [ David Kalnischkies ] + * add a README for vendor information + * remove outdated README.MultiArch + * build http request in a stringstream + * enforce LFS for partial files in https range requests + * handle pkgnames shorter than modifiers (Closes: 744940) + * allow vendors to install configuration files + + [ John Ogness ] + * properly undo CD-ROM mount in all error cases + + [ Mahyuddin Ramli ] + * add vendor information for BlankOn (Closes: 743595) + + [ Adam Conrad ] + * fix FileFd::Size bitswap on big-endian architectures (Closes: 745866) + + [ Trần Ngọc Quân ] + * l10n: vi.po: Update one new string + + -- Michael Vogt Mon, 05 May 2014 14:03:15 +0200 + +apt (1.0.2) unstable; urgency=medium + + [ Michael Vogt ] + * fix apt list output for pkgs in dpkg ^rc state + * Notice the user about "apt list -a" when only a single hit if found + * fix test-failure in adt + * apt-private/acqprogress.cc: fix output when ctrl-c is hit during + apt update (LP: #1310548, closes: #744297) + * Fix option name DPkg::Progress-Fancy in apt.8 manpage + (LP: #1310506) + + [ David Kalnischkies ] + * don't double-count seeks in FileFd::Skip for bzip/xz + * deal with umask only if we really need to for mkstemp + * consider priorities only for downloadable pkgs in resolver + * force fancy progressbar redraw on window size change + * clear HitEof flag in FileFd::Seek + * use Google C++ Testing Framework for libapt tests + * support dist-upgrade options in full-upgrade + + [ Trần Ngọc Quân ] + * l10n: vi.po (624t): Update translation + + [ Theppitak Karoonboonyanan ] + * Updated Thai program translation (closes: #745120) + + [ James McCoy ] + * Consistently use Dpkg::Progress* in documentation (Closes: 745452) + + -- Michael Vogt Fri, 25 Apr 2014 13:15:03 +0200 + +apt (1.0.1) unstable; urgency=medium + + [ Michael Vogt ] + * Fix crash in "apt list" when a sources.list file is unreable + (Closes: 743413) + * make apt search case-insensitive by default + * Fix possible race when stunnel/aptwebserver create their PID files + in the tests + * Fix insecure file permissions when using FileFd with OpenMode::Atomic + (LP: #1304657) + + [ Julian Andres Klode ] + * Version the Breaks/Replaces for sun-java{5,6}-jdk (LP: #1302736) + (Closes: #743616) + * Add versioned openjdk-6-jdk breaks + + [ Josef Vitu ] + * apt: Minor typo in 'apt' man page (closes: #743657) + + -- Michael Vogt Thu, 10 Apr 2014 09:48:56 +0200 + apt (1.0) unstable; urgency=low The "Happy birthday and 10000b years in the making" release diff --git a/debian/control b/debian/control index c0526d30c..ff984db75 100644 --- a/debian/control +++ b/debian/control @@ -9,7 +9,7 @@ Build-Depends: dpkg-dev (>= 1.15.8), debhelper (>= 8.1.3~), libdb-dev, gettext (>= 0.12), libcurl4-gnutls-dev (>= 7.19.4~), zlib1g-dev, libbz2-dev, liblzma-dev, xsltproc, docbook-xsl, docbook-xml, po4a (>= 0.34-2), - autotools-dev, autoconf, automake + autotools-dev, autoconf, automake, libgtest-dev 
Build-Depends-Indep: doxygen, debiandoc-sgml, graphviz Build-Conflicts: autoconf2.13, automake1.4 Vcs-Git: git://anonscm.debian.org/apt/apt.git diff --git a/debian/libapt-inst1.5.symbols b/debian/libapt-inst1.5.symbols index 35cce919f..8ce707287 100644 --- a/debian/libapt-inst1.5.symbols +++ b/debian/libapt-inst1.5.symbols @@ -88,4 +88,4 @@ libapt-inst.so.1.5 libapt-inst1.5 #MINVER# (c++|regex|optional=std)"^std::basic_string<.+ >\(.+\)@Base$" 0.8.0 (c++|regex|optional=std)"^typeinfo name for std::iterator<.*>@Base$" 0.8.0 (c++|regex|optional=std)"^typeinfo for std::iterator<.*>@Base$" 0.8.0 -### + (c++|optional=std)"std::ctype::do_widen(char) const@Base" 1.0.3 diff --git a/debian/libapt-pkg4.12.symbols b/debian/libapt-pkg4.12.symbols index 5d7b21f10..3fa128cff 100644 --- a/debian/libapt-pkg4.12.symbols +++ b/debian/libapt-pkg4.12.symbols @@ -181,7 +181,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgRecords::Parser::ShortDesc()@Base" 0.8.0 (c++)"pkgRecords::Parser::SourcePkg()@Base" 0.8.0 (c++)"pkgRecords::Parser::SourceVer()@Base" 0.8.0 - (c++)"pkgRecords::Parser::~Parser()@Base" 0.8.0 (c++)"pkgRecords::pkgRecords(pkgCache&)@Base" 0.8.0 (c++)"pkgRecords::~pkgRecords()@Base" 0.8.0 (c++)"pkgTagFile::Step(pkgTagSection&)@Base" 0.8.0 @@ -315,8 +314,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgIndexFile::Type::GlobalListLen@Base" 0.8.0 (c++)"pkgIndexFile::Type::GetType(char const*)@Base" 0.8.0 (c++)"pkgIndexFile::Type::Type()@Base" 0.8.0 - (c++)"pkgIndexFile::Type::~Type()@Base" 0.8.0 - (c++)"pkgIndexFile::~pkgIndexFile()@Base" 0.8.0 (c++)"pkgOrderList::VisitRDeps(bool (pkgOrderList::*)(pkgCache::DepIterator), pkgCache::PkgIterator)@Base" 0.8.0 (c++)"pkgOrderList::OrderUnpack(std::basic_string, std::allocator >*)@Base" 0.8.0 (c++)"pkgOrderList::DepConfigure(pkgCache::DepIterator)@Base" 0.8.0 @@ -402,7 +399,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgSourceList::Type::GlobalListLen@Base" 0.8.0 (c++)"pkgSourceList::Type::GetType(char const*)@Base" 0.8.0 (c++)"pkgSourceList::Type::Type()@Base" 0.8.0 - (c++)"pkgSourceList::Type::~Type()@Base" 0.8.0 (c++)"pkgSourceList::Reset()@Base" 0.8.0 (c++)"pkgSourceList::pkgSourceList(std::basic_string, std::allocator >)@Base" 0.8.0 (c++)"pkgSourceList::pkgSourceList()@Base" 0.8.0 @@ -411,7 +407,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgSrcRecords::Find(char const*, bool const&)@Base" 0.8.0 (c++)"pkgSrcRecords::Parser::BuildDepRec::~BuildDepRec()@Base" 0.8.0 (c++)"pkgSrcRecords::Parser::BuildDepType(unsigned char const&)@Base" 0.8.0 - (c++)"pkgSrcRecords::Parser::~Parser()@Base" 0.8.0 (c++)"pkgSrcRecords::Restart()@Base" 0.8.0 (c++)"pkgSrcRecords::pkgSrcRecords(pkgSourceList&)@Base" 0.8.0 (c++)"pkgSrcRecords::~pkgSrcRecords()@Base" 0.8.0 @@ -457,7 +452,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"debReleaseIndex::debReleaseIndex(std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&)@Base" 0.8.0 (c++)"debReleaseIndex::~debReleaseIndex()@Base" 0.8.0 (c++)"debSLTypeDebSrc::~debSLTypeDebSrc()@Base" 0.8.0 - (c++)"debSLTypeDebian::~debSLTypeDebian()@Base" 0.8.0 (c++)"debSourcesIndex::debSourcesIndex(std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, bool)@Base" 0.8.0 (c++)"debSourcesIndex::~debSourcesIndex()@Base" 0.8.0 (c++)"pkgAcqDiffIndex::ParseDiffIndex(std::basic_string, std::allocator >)@Base" 0.8.0 @@ -503,13 +497,11 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgAcquireStatus::Start()@Base" 0.8.0 
(c++)"pkgAcquireStatus::IMSHit(pkgAcquire::ItemDesc&)@Base" 0.8.0 (c++)"pkgAcquireStatus::pkgAcquireStatus()@Base" 0.8.0 - (c++)"pkgAcquireStatus::~pkgAcquireStatus()@Base" 0.8.0 (c++)"PreferenceSection::TrimRecord(bool, char const*&)@Base" 0.8.0 (c++)"pkgArchiveCleaner::Go(std::basic_string, std::allocator >, pkgCache&)@Base" 0.8.0 (c++)"pkgCacheGenerator::ListParser::NewDepends(pkgCache::VerIterator&, std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&, unsigned int, unsigned int)@Base" 0.8.0 (c++)"pkgCacheGenerator::ListParser::NewProvides(pkgCache::VerIterator&, std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&)@Base" 0.8.0 (c++)"pkgCacheGenerator::ListParser::CollectFileProvides(pkgCache&, pkgCache::VerIterator&)@Base" 0.8.0 - (c++)"pkgCacheGenerator::ListParser::~ListParser()@Base" 0.8.0 (c++)"pkgCacheGenerator::NewFileVer(pkgCache::VerIterator&, pkgCacheGenerator::ListParser&)@Base" 0.8.0 (c++)"pkgCacheGenerator::NewPackage(pkgCache::PkgIterator&, std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&)@Base" 0.8.0 (c++)"pkgCacheGenerator::SelectFile(std::basic_string, std::allocator > const&, std::basic_string, std::allocator > const&, pkgIndexFile const&, unsigned long)@Base" 0.8.0 @@ -586,7 +578,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgVersioningSystem::TestCompatibility(pkgVersioningSystem const&)@Base" 0.8.0 (c++)"pkgVersioningSystem::GetVS(char const*)@Base" 0.8.0 (c++)"pkgVersioningSystem::pkgVersioningSystem()@Base" 0.8.0 - (c++)"pkgVersioningSystem::~pkgVersioningSystem()@Base" 0.8.0 (c++)"debTranslationsIndex::debTranslationsIndex(std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, char const*)@Base" 0.8.0 (c++)"debTranslationsIndex::~debTranslationsIndex()@Base" 0.8.0 (c++)"APT::CacheFilter::PackageNameMatchesRegEx::PackageNameMatchesRegEx(std::basic_string, std::allocator > const&)@Base" 0.8.0 @@ -671,7 +662,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"IndexCopy::ConvertToSourceList(std::basic_string, std::allocator >, std::basic_string, std::allocator >&)@Base" 0.8.0 (c++)"IndexCopy::ChopDirs(std::basic_string, std::allocator >, unsigned int)@Base" 0.8.0 (c++)"IndexCopy::GrabFirst(std::basic_string, std::allocator >, std::basic_string, std::allocator >&, unsigned int)@Base" 0.8.0 - (c++)"IndexCopy::~IndexCopy()@Base" 0.8.0 (c++)"SigVerify::CopyAndVerify(std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::vector, std::allocator >, std::allocator, std::allocator > > >&, std::vector, std::allocator >, std::allocator, std::allocator > > >, std::vector, std::allocator >, std::allocator, std::allocator > > >)@Base" 0.8.0 (c++)"SigVerify::CopyMetaIndex(std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >)@Base" 0.8.0 (c++)"SigVerify::Verify(std::basic_string, std::allocator >, std::basic_string, std::allocator >, indexRecords*)@Base" 0.8.0 @@ -685,7 +675,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"debSystem::UnLock(bool)@Base" 0.8.0 (c++)"debSystem::debSystem()@Base" 0.8.0 (c++)"debSystem::~debSystem()@Base" 0.8.0 - (c++)"metaIndex::~metaIndex()@Base" 0.8.0 (c++)"pkgDPkgPM::SendV2Pkgs(_IO_FILE*)@Base" 0.8.0 (c++)"pkgDPkgPM::DoTerminalPty(int)@Base" 0.8.0 
(c++)"pkgDPkgPM::WriteHistoryTag(std::basic_string, std::allocator > const&, std::basic_string, std::allocator >)@Base" 0.8.0 @@ -716,7 +705,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"pkgSystem::Score(Configuration const&)@Base" 0.8.0 (c++)"pkgSystem::GetSystem(char const*)@Base" 0.8.0 (c++)"pkgSystem::pkgSystem()@Base" 0.8.0 - (c++)"pkgSystem::~pkgSystem()@Base" 0.8.0 (c++)"HashString::VerifyFile(std::basic_string, std::allocator >) const@Base" 0.8.0 (c++)"HashString::empty() const@Base" 0.8.0 (c++)"HashString::toStr() const@Base" 0.8.0 @@ -1448,7 +1436,6 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# ### rework of the packagemanager rework (c++)"APT::Progress::PackageManager::ConffilePrompt(std::basic_string, std::allocator >, unsigned int, unsigned int, std::basic_string, std::allocator >)@Base" 0.9.13~exp1 (c++)"APT::Progress::PackageManager::Error(std::basic_string, std::allocator >, unsigned int, unsigned int, std::basic_string, std::allocator >)@Base" 0.9.13~exp1 - (c++)"APT::Progress::PackageManagerFancy::GetNumberTerminalRows()@Base" 0.9.13~exp1 (c++)"APT::Progress::PackageManagerFancy::HandleSIGWINCH(int)@Base" 0.9.13~exp1 (c++)"APT::Progress::PackageManagerFancy::~PackageManagerFancy()@Base" 0.9.13~exp1 (c++)"APT::Progress::PackageManagerFancy::PackageManagerFancy()@Base" 0.9.13~exp1 @@ -1579,13 +1566,40 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++)"debListParser::ParseDepends(char const*, char const*, std::basic_string, std::allocator >&, std::basic_string, std::allocator >&, unsigned int&, bool const&, bool const&, bool const&)@Base" 0.9.16 (c++)"pkgCacheGenerator::ListParser::SameVersion(unsigned short, pkgCache::VerIterator const&)@Base" 0.9.16 (c++)"Rename(std::basic_string, std::allocator >, std::basic_string, std::allocator >)@Base" 0.9.16 + (c++)"pkgDepCache::IsInstallOkDependenciesSatisfiableByCandidates(pkgCache::PkgIterator const&, bool, unsigned long, bool)@Base" 1.0 + (c++)"APT::Progress::PackageManagerFancy::GetTerminalSize()@Base" 1.0 + (c++)"APT::Progress::PackageManagerFancy::GetTextProgressStr(float, int)@Base" 1.0 + (c++)"pkgCdromStatus::GetOpProgress()@Base" 1.0 + (c++)"pkgCdromStatus::SetTotal(int)@Base" 1.0 + (c++)"EDSP::ExecuteSolver(char const*, int*, int*, bool)@Base" 1.0.4 + (c++)"pkgPackageManager::EarlyRemove(pkgCache::PkgIterator, pkgCache::DepIterator const*)@Base" 1.0.4 + (c++)"debTranslationsParser::Architecture()@Base" 1.0.4 + (c++)"debTranslationsParser::~debTranslationsParser()@Base" 1.0.4 + (c++)"debTranslationsParser::Version()@Base" 1.0.4 + (c++)"typeinfo for debTranslationsParser@Base" 1.0.4 + (c++)"typeinfo name for debTranslationsParser@Base" 1.0.4 + (c++)"vtable for debTranslationsParser@Base" 1.0.4 ### demangle strangeness - buildd report it as MISSING and as new… (c++)"pkgAcqMetaSig::pkgAcqMetaSig(pkgAcquire*, std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::basic_string, std::allocator >, std::vector > const*, indexRecords*)@Base" 0.8.0 ### gcc-4.6 artefacts - (c++|optional=implicit)"HashString::operator=(HashString const&)@Base" 0.8.0 - (c++|optional=implicit)"HashString::HashString(HashString const&)@Base" 0.8.0 - (c++|optional=inline)"APT::VersionContainer > >::iterator std::max_element > >::iterator, CompareProviders>(APT::VersionContainer > >::iterator, APT::VersionContainer > >::iterator, CompareProviders)@Base" 0.8.0 - 
(c++|optional=inline)"pkgCache::VerIterator::ParentPkg() const@Base" 0.8.0 +# (c++|optional=implicit)"HashString::operator=(HashString const&)@Base" 0.8.0 +# (c++|optional=implicit)"HashString::HashString(HashString const&)@Base" 0.8.0 +# (c++|optional=inline)"APT::VersionContainer > >::iterator std::max_element > >::iterator, CompareProviders>(APT::VersionContainer > >::iterator, APT::VersionContainer > >::iterator, CompareProviders)@Base" 0.8.0 +# (c++|optional=inline)"pkgCache::VerIterator::ParentPkg() const@Base" 0.8.0 +### gcc-4.8 artefacts +# (c++|optional=implicit)"debSLTypeDebian::~debSLTypeDebian()@Base" 0.8.0 +### empty destructors included in the .h file +# (c++|optional=inline)"pkgVersioningSystem::~pkgVersioningSystem()@Base" 0.8.0 +# (c++|optional=inline)"pkgSystem::~pkgSystem()@Base" 0.8.0 +# (c++|optional=inline)"pkgRecords::Parser::~Parser()@Base" 0.8.0 +# (c++|optional=inline)"pkgSrcRecords::Parser::~Parser()@Base" 0.8.0 +# (c++|optional=inline)"pkgIndexFile::Type::~Type()@Base" 0.8.0 +# (c++|optional=inline)"pkgSourceList::Type::~Type()@Base" 0.8.0 +# (c++|optional=inline)"pkgIndexFile::~pkgIndexFile()@Base" 0.8.0 +# (c++|optional=inline)"pkgCacheGenerator::ListParser::~ListParser()@Base" 0.8.0 +# (c++|optional=inline)"pkgAcquireStatus::~pkgAcquireStatus()@Base" 0.8.0 +# (c++|optional=inline)"metaIndex::~metaIndex()@Base" 0.8.0 +# (c++|optional=inline)"IndexCopy::~IndexCopy()@Base" 0.8.0 ### std library artefacts (c++|regex|optional=std)"^std::vector::(vector|push_back|erase|_[^ ]+)\(.+\)( const|)@Base$" 0.8.0 @@ -1593,15 +1607,13 @@ libapt-pkg.so.4.12 libapt-pkg4.12 #MINVER# (c++|optional=std)"char* std::basic_string, std::allocator >::_S_construct<__gnu_cxx::__normal_iterator, std::allocator > > >(__gnu_cxx::__normal_iterator, std::allocator > >, __gnu_cxx::__normal_iterator, std::allocator > >, std::allocator const&, std::forward_iterator_tag)@Base" 0.8.0 (c++|optional=std)"char* std::basic_string, std::allocator >::_S_construct(char const*, char const*, std::allocator const&, std::forward_iterator_tag)@Base" 0.8.0 (c++|optional=std)"char* std::basic_string, std::allocator >::_S_construct(char*, char*, std::allocator const&, std::forward_iterator_tag)@Base" 0.8.0 - (c++|optional=std)"std::basic_string, std::allocator >& std::basic_string, std::allocator >::_M_replace_dispatch(__gnu_cxx::__normal_iterator, std::allocator > >, __gnu_cxx::__normal_iterator, std::allocator > >, unsigned char*, unsigned char*, std::__false_type)@Base" 0.8.0 ### try to ignore std:: template instances (c++|regex|optional=std)"^(void |)std::[^ ]+<.+ >::(_|~).+\(.*\)@Base$" 0.8.0 (c++|regex|optional=std)"^std::[^ ]+<.+ >::(append|insert|reserve|operator[^ ]+)\(.*\)@Base$" 0.8.0 (c++|regex|optional=std)"^(void |DiffInfo\* |)std::_.*@Base$" 0.8.0 - (c++|regex|optional=std)"^(bool|void) std::(operator|sort_heap|make_heap)[^ ]+<.+ >\(.+\)@Base$" 0.8.0 (c++|regex|optional=std)"^std::reverse_iterator<.+ > std::__.+@Base$" 0.8.0 (c++|regex|optional=std)"^std::basic_string<.+ >\(.+\)@Base$" 0.8.0 - (c++|regex|optional=std)"^std::basic_string<.+ >::basic_string<.+>\(.+\)@Base$" 0.8.0 (c++|regex|optional=std)"^__gnu_cxx::__[^ ]+<.*@Base$" 0.8.0 (c++|regex|optional=std)"^typeinfo name for std::iterator<.*>@Base$" 0.8.0 (c++|regex|optional=std)"^typeinfo for std::iterator<.*>@Base$" 0.8.0 + (c++|optional=std)"std::ctype::do_widen(char) const@Base" 1.0.3 diff --git a/debian/rules b/debian/rules index eec016607..f8b392986 100755 --- a/debian/rules +++ b/debian/rules @@ -189,6 +189,8 @@ apt: build-binary 
build-manpages debian/apt.install cp debian/apt.conf.autoremove debian/$@/etc/apt/apt.conf.d/01autoremove cp debian/apt.auto-removal.sh debian/$@/etc/kernel/postinst.d/apt-auto-removal chmod 755 debian/$@/etc/kernel/postinst.d/apt-auto-removal + # install vendor specific apt confs + find -L vendor/current -name 'apt.conf-*' | while read conf; do cp "$${conf}" "debian/$@/etc/apt/apt.conf.d/$${conf#*-}"; done # make rosetta happy and remove pot files in po/ (but leave stuff # in po/domains/* untouched) and cp *.po into each domain dir diff --git a/doc/Doxyfile.in b/doc/Doxyfile.in index ffd7c88b9..6ca8d1189 100644 --- a/doc/Doxyfile.in +++ b/doc/Doxyfile.in @@ -1,112 +1,129 @@ -# Doxyfile 1.8.4 +# Doxyfile 1.8.7 # This file describes the settings to be used by the documentation system # doxygen (www.doxygen.org) for a project. # -# All text after a double hash (##) is considered a comment and is placed -# in front of the TAG it is preceding . -# All text after a hash (#) is considered a comment and will be ignored. +# All text after a double hash (##) is considered a comment and is placed in +# front of the TAG it is preceding. +# +# All text after a single hash (#) is considered a comment and will be ignored. # The format is: -# TAG = value [value, ...] -# For lists items can also be appended using: -# TAG += value [value, ...] -# Values that contain spaces should be placed between quotes (" "). +# TAG = value [value, ...] +# For lists, items can also be appended using: +# TAG += value [value, ...] +# Values that contain spaces should be placed between quotes (\" \"). #--------------------------------------------------------------------------- # Project related configuration options #--------------------------------------------------------------------------- # This tag specifies the encoding used for all characters in the config file -# that follow. The default is UTF-8 which is also the encoding used for all -# text before the first occurrence of this tag. Doxygen uses libiconv (or the -# iconv built into libc) for the transcoding. See -# http://www.gnu.org/software/libiconv for the list of possible encodings. +# that follow. The default is UTF-8 which is also the encoding used for all text +# before the first occurrence of this tag. Doxygen uses libiconv (or the iconv +# built into libc) for the transcoding. See http://www.gnu.org/software/libiconv +# for the list of possible encodings. +# The default value is: UTF-8. DOXYFILE_ENCODING = UTF-8 -# The PROJECT_NAME tag is a single word (or sequence of words) that should -# identify the project. Note that if you do not use Doxywizard you need -# to put quotes around the project name if it contains spaces. +# The PROJECT_NAME tag is a single word (or a sequence of words surrounded by +# double-quotes, unless you are using Doxywizard) that should identify the +# project for which the documentation is generated. This name is used in the +# title of most generated pages and in a few other places. +# The default value is: My Project. PROJECT_NAME = @PACKAGE@ -# The PROJECT_NUMBER tag can be used to enter a project or revision number. -# This could be handy for archiving the generated documentation or -# if some version control system is used. +# The PROJECT_NUMBER tag can be used to enter a project or revision number. This +# could be handy for archiving the generated documentation or if some version +# control system is used. 
PROJECT_NUMBER = @PACKAGE_VERSION@ # Using the PROJECT_BRIEF tag one can provide an optional one line description -# for a project that appears at the top of each page and should give viewer -# a quick idea about the purpose of the project. Keep the description short. +# for a project that appears at the top of each page and should give viewer a +# quick idea about the purpose of the project. Keep the description short. PROJECT_BRIEF = "commandline package manager" -# With the PROJECT_LOGO tag one can specify an logo or icon that is -# included in the documentation. The maximum height of the logo should not -# exceed 55 pixels and the maximum width should not exceed 200 pixels. -# Doxygen will copy the logo to the output directory. +# With the PROJECT_LOGO tag one can specify an logo or icon that is included in +# the documentation. The maximum height of the logo should not exceed 55 pixels +# and the maximum width should not exceed 200 pixels. Doxygen will copy the logo +# to the output directory. PROJECT_LOGO = -# The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute) -# base path where the generated documentation will be put. -# If a relative path is entered, it will be relative to the location -# where doxygen was started. If left blank the current directory will be used. +# The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute) path +# into which the generated documentation will be written. If a relative path is +# entered, it will be relative to the location where doxygen was started. If +# left blank the current directory will be used. OUTPUT_DIRECTORY = ../build/doc/doxygen -# If the CREATE_SUBDIRS tag is set to YES, then doxygen will create -# 4096 sub-directories (in 2 levels) under the output directory of each output -# format and will distribute the generated files over these directories. -# Enabling this option can be useful when feeding doxygen a huge amount of -# source files, where putting all generated files in the same directory would -# otherwise cause performance problems for the file system. +# If the CREATE_SUBDIRS tag is set to YES, then doxygen will create 4096 sub- +# directories (in 2 levels) under the output directory of each output format and +# will distribute the generated files over these directories. Enabling this +# option can be useful when feeding doxygen a huge amount of source files, where +# putting all generated files in the same directory would otherwise causes +# performance problems for the file system. +# The default value is: NO. CREATE_SUBDIRS = NO +# If the ALLOW_UNICODE_NAMES tag is set to YES, doxygen will allow non-ASCII +# characters to appear in the names of generated files. If set to NO, non-ASCII +# characters will be escaped, for example _xE3_x81_x84 will be used for Unicode +# U+3044. +# The default value is: NO. + +ALLOW_UNICODE_NAMES = NO + # The OUTPUT_LANGUAGE tag is used to specify the language in which all # documentation generated by doxygen is written. Doxygen will use this # information to generate all constant output in the proper language. 
-# The default language is English, other supported languages are: -# Afrikaans, Arabic, Brazilian, Catalan, Chinese, Chinese-Traditional, -# Croatian, Czech, Danish, Dutch, Esperanto, Farsi, Finnish, French, German, -# Greek, Hungarian, Italian, Japanese, Japanese-en (Japanese with English -# messages), Korean, Korean-en, Latvian, Lithuanian, Norwegian, Macedonian, -# Persian, Polish, Portuguese, Romanian, Russian, Serbian, Serbian-Cyrillic, -# Slovak, Slovene, Spanish, Swedish, Ukrainian, and Vietnamese. +# Possible values are: Afrikaans, Arabic, Armenian, Brazilian, Catalan, Chinese, +# Chinese-Traditional, Croatian, Czech, Danish, Dutch, English (United States), +# Esperanto, Farsi (Persian), Finnish, French, German, Greek, Hungarian, +# Indonesian, Italian, Japanese, Japanese-en (Japanese with English messages), +# Korean, Korean-en (Korean with English messages), Latvian, Lithuanian, +# Macedonian, Norwegian, Persian (Farsi), Polish, Portuguese, Romanian, Russian, +# Serbian, Serbian-Cyrillic, Slovak, Slovene, Spanish, Swedish, Turkish, +# Ukrainian and Vietnamese. +# The default value is: English. OUTPUT_LANGUAGE = English -# If the BRIEF_MEMBER_DESC tag is set to YES (the default) Doxygen will -# include brief member descriptions after the members that are listed in -# the file and class documentation (similar to JavaDoc). -# Set to NO to disable this. +# If the BRIEF_MEMBER_DESC tag is set to YES doxygen will include brief member +# descriptions after the members that are listed in the file and class +# documentation (similar to Javadoc). Set to NO to disable this. +# The default value is: YES. BRIEF_MEMBER_DESC = YES -# If the REPEAT_BRIEF tag is set to YES (the default) Doxygen will prepend -# the brief description of a member or function before the detailed description. -# Note: if both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the +# If the REPEAT_BRIEF tag is set to YES doxygen will prepend the brief +# description of a member or function before the detailed description +# +# Note: If both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the # brief descriptions will be completely suppressed. +# The default value is: YES. REPEAT_BRIEF = YES -# This tag implements a quasi-intelligent brief description abbreviator -# that is used to form the text in various listings. Each string -# in this list, if found as the leading text of the brief description, will be -# stripped from the text and the result after processing the whole list, is -# used as the annotated text. Otherwise, the brief description is used as-is. -# If left blank, the following values are used ("$name" is automatically -# replaced with the name of the entity): "The $name class" "The $name widget" -# "The $name file" "is" "provides" "specifies" "contains" -# "represents" "a" "an" "the" +# This tag implements a quasi-intelligent brief description abbreviator that is +# used to form the text in various listings. Each string in this list, if found +# as the leading text of the brief description, will be stripped from the text +# and the result, after processing the whole list, is used as the annotated +# text. Otherwise, the brief description is used as-is. If left blank, the +# following values are used ($name is automatically replaced with the name of +# the entity):The $name class, The $name widget, The $name file, is, provides, +# specifies, contains, represents, a, an and the. 
ABBREVIATE_BRIEF = # If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then -# Doxygen will generate a detailed section even if there is only a brief +# doxygen will generate a detailed section even if there is only a brief # description. +# The default value is: NO. ALWAYS_DETAILED_SEC = NO @@ -114,143 +131,165 @@ ALWAYS_DETAILED_SEC = NO # inherited members of a class in the documentation of that class as if those # members were ordinary class members. Constructors, destructors and assignment # operators of the base classes will not be shown. +# The default value is: NO. INLINE_INHERITED_MEMB = NO -# If the FULL_PATH_NAMES tag is set to YES then Doxygen will prepend the full -# path before files name in the file list and in the header files. If set -# to NO the shortest path that makes the file name unique will be used. +# If the FULL_PATH_NAMES tag is set to YES doxygen will prepend the full path +# before files name in the file list and in the header files. If set to NO the +# shortest path that makes the file name unique will be used +# The default value is: YES. FULL_PATH_NAMES = YES -# If the FULL_PATH_NAMES tag is set to YES then the STRIP_FROM_PATH tag -# can be used to strip a user-defined part of the path. Stripping is -# only done if one of the specified strings matches the left-hand part of -# the path. The tag can be used to show relative paths in the file list. -# If left blank the directory from which doxygen is run is used as the -# path to strip. Note that you specify absolute paths here, but also -# relative paths, which will be relative from the directory where doxygen is -# started. +# The STRIP_FROM_PATH tag can be used to strip a user-defined part of the path. +# Stripping is only done if one of the specified strings matches the left-hand +# part of the path. The tag can be used to show relative paths in the file list. +# If left blank the directory from which doxygen is run is used as the path to +# strip. +# +# Note that you can specify absolute paths here, but also relative paths, which +# will be relative from the directory where doxygen is started. +# This tag requires that the tag FULL_PATH_NAMES is set to YES. STRIP_FROM_PATH = -# The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of -# the path mentioned in the documentation of a class, which tells -# the reader which header file to include in order to use a class. -# If left blank only the name of the header file containing the class -# definition is used. Otherwise one should specify the include paths that -# are normally passed to the compiler using the -I flag. +# The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of the +# path mentioned in the documentation of a class, which tells the reader which +# header file to include in order to use a class. If left blank only the name of +# the header file containing the class definition is used. Otherwise one should +# specify the list of include paths that are normally passed to the compiler +# using the -I flag. STRIP_FROM_INC_PATH = -# If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter -# (but less readable) file names. This can be useful if your file system -# doesn't support long names like on DOS, Mac, or CD-ROM. +# If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter (but +# less readable) file names. This can be useful is your file systems doesn't +# support long names like on DOS, Mac, or CD-ROM. +# The default value is: NO. 
SHORT_NAMES = NO -# If the JAVADOC_AUTOBRIEF tag is set to YES then Doxygen -# will interpret the first line (until the first dot) of a JavaDoc-style -# comment as the brief description. If set to NO, the JavaDoc -# comments will behave just like regular Qt-style comments -# (thus requiring an explicit @brief command for a brief description.) +# If the JAVADOC_AUTOBRIEF tag is set to YES then doxygen will interpret the +# first line (until the first dot) of a Javadoc-style comment as the brief +# description. If set to NO, the Javadoc-style will behave just like regular Qt- +# style comments (thus requiring an explicit @brief command for a brief +# description.) +# The default value is: NO. JAVADOC_AUTOBRIEF = NO -# If the QT_AUTOBRIEF tag is set to YES then Doxygen will -# interpret the first line (until the first dot) of a Qt-style -# comment as the brief description. If set to NO, the comments -# will behave just like regular Qt-style comments (thus requiring -# an explicit \brief command for a brief description.) +# If the QT_AUTOBRIEF tag is set to YES then doxygen will interpret the first +# line (until the first dot) of a Qt-style comment as the brief description. If +# set to NO, the Qt-style will behave just like regular Qt-style comments (thus +# requiring an explicit \brief command for a brief description.) +# The default value is: NO. QT_AUTOBRIEF = NO -# The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make Doxygen -# treat a multi-line C++ special comment block (i.e. a block of //! or /// -# comments) as a brief description. This used to be the default behaviour. -# The new default is to treat a multi-line C++ comment block as a detailed -# description. Set this tag to YES if you prefer the old behaviour instead. +# The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make doxygen treat a +# multi-line C++ special comment block (i.e. a block of //! or /// comments) as +# a brief description. This used to be the default behavior. The new default is +# to treat a multi-line C++ comment block as a detailed description. Set this +# tag to YES if you prefer the old behavior instead. +# +# Note that setting this tag to YES also means that rational rose comments are +# not recognized any more. +# The default value is: NO. MULTILINE_CPP_IS_BRIEF = NO -# If the INHERIT_DOCS tag is set to YES (the default) then an undocumented -# member inherits the documentation from any documented member that it -# re-implements. +# If the INHERIT_DOCS tag is set to YES then an undocumented member inherits the +# documentation from any documented member that it re-implements. +# The default value is: YES. INHERIT_DOCS = YES -# If the SEPARATE_MEMBER_PAGES tag is set to YES, then doxygen will produce -# a new page for each member. If set to NO, the documentation of a member will -# be part of the file/class/namespace that contains it. +# If the SEPARATE_MEMBER_PAGES tag is set to YES, then doxygen will produce a +# new page for each member. If set to NO, the documentation of a member will be +# part of the file/class/namespace that contains it. +# The default value is: NO. SEPARATE_MEMBER_PAGES = NO -# The TAB_SIZE tag can be used to set the number of spaces in a tab. -# Doxygen uses this value to replace tabs by spaces in code fragments. +# The TAB_SIZE tag can be used to set the number of spaces in a tab. Doxygen +# uses this value to replace tabs by spaces in code fragments. +# Minimum value: 1, maximum value: 16, default value: 4. 
TAB_SIZE = 8 -# This tag can be used to specify a number of aliases that acts -# as commands in the documentation. An alias has the form "name=value". -# For example adding "sideeffect=\par Side Effects:\n" will allow you to -# put the command \sideeffect (or @sideeffect) in the documentation, which -# will result in a user-defined paragraph with heading "Side Effects:". -# You can put \n's in the value part of an alias to insert newlines. +# This tag can be used to specify a number of aliases that act as commands in +# the documentation. An alias has the form: +# name=value +# For example adding +# "sideeffect=@par Side Effects:\n" +# will allow you to put the command \sideeffect (or @sideeffect) in the +# documentation, which will result in a user-defined paragraph with heading +# "Side Effects:". You can put \n's in the value part of an alias to insert +# newlines. ALIASES = "TODO=\todo" # This tag can be used to specify a number of word-keyword mappings (TCL only). -# A mapping has the form "name=value". For example adding -# "class=itcl::class" will allow you to use the command class in the -# itcl::class meaning. +# A mapping has the form "name=value". For example adding "class=itcl::class" +# will allow you to use the command class in the itcl::class meaning. TCL_SUBST = -# Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C -# sources only. Doxygen will then generate output that is more tailored for C. -# For instance, some of the names that are used will be different. The list -# of all members will be omitted, etc. +# Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C sources +# only. Doxygen will then generate output that is more tailored for C. For +# instance, some of the names that are used will be different. The list of all +# members will be omitted, etc. +# The default value is: NO. OPTIMIZE_OUTPUT_FOR_C = NO -# Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java -# sources only. Doxygen will then generate output that is more tailored for -# Java. For instance, namespaces will be presented as packages, qualified -# scopes will look different, etc. +# Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java or +# Python sources only. Doxygen will then generate output that is more tailored +# for that language. For instance, namespaces will be presented as packages, +# qualified scopes will look different, etc. +# The default value is: NO. OPTIMIZE_OUTPUT_JAVA = NO # Set the OPTIMIZE_FOR_FORTRAN tag to YES if your project consists of Fortran -# sources only. Doxygen will then generate output that is more tailored for -# Fortran. +# sources. Doxygen will then generate output that is tailored for Fortran. +# The default value is: NO. OPTIMIZE_FOR_FORTRAN = NO # Set the OPTIMIZE_OUTPUT_VHDL tag to YES if your project consists of VHDL -# sources. Doxygen will then generate output that is tailored for -# VHDL. +# sources. Doxygen will then generate output that is tailored for VHDL. +# The default value is: NO. OPTIMIZE_OUTPUT_VHDL = NO # Doxygen selects the parser to use depending on the extension of the files it # parses. With this tag you can assign which parser to use for a given # extension. Doxygen has a built-in mapping, but you can override or extend it -# using this tag. The format is ext=language, where ext is a file extension, -# and language is one of the parsers supported by doxygen: IDL, Java, -# Javascript, CSharp, C, C++, D, PHP, Objective-C, Python, Fortran, VHDL, C, -# C++. 
For instance to make doxygen treat .inc files as Fortran files (default -# is PHP), and .f files as C (default is Fortran), use: inc=Fortran f=C. Note -# that for custom extensions you also need to set FILE_PATTERNS otherwise the -# files are not read by doxygen. +# using this tag. The format is ext=language, where ext is a file extension, and +# language is one of the parsers supported by doxygen: IDL, Java, Javascript, +# C#, C, C++, D, PHP, Objective-C, Python, Fortran (fixed format Fortran: +# FortranFixed, free formatted Fortran: FortranFree, unknown formatted Fortran: +# Fortran. In the later case the parser tries to guess whether the code is fixed +# or free formatted code, this is the default for Fortran type files), VHDL. For +# instance to make doxygen treat .inc files as Fortran files (default is PHP), +# and .f files as C (default is Fortran), use: inc=Fortran f=C. +# +# Note For files without extension you can use no_extension as a placeholder. +# +# Note that for custom extensions you also need to set FILE_PATTERNS otherwise +# the files are not read by doxygen. EXTENSION_MAPPING = -# If MARKDOWN_SUPPORT is enabled (the default) then doxygen pre-processes all -# comments according to the Markdown format, which allows for more readable +# If the MARKDOWN_SUPPORT tag is enabled then doxygen pre-processes all comments +# according to the Markdown format, which allows for more readable # documentation. See http://daringfireball.net/projects/markdown/ for details. -# The output of markdown processing is further processed by doxygen, so you -# can mix doxygen, HTML, and XML commands with Markdown formatting. -# Disable only in case of backward compatibilities issues. +# The output of markdown processing is further processed by doxygen, so you can +# mix doxygen, HTML, and XML commands with Markdown formatting. Disable only in +# case of backward compatibilities issues. +# The default value is: YES. MARKDOWN_SUPPORT = YES @@ -258,35 +297,41 @@ MARKDOWN_SUPPORT = YES # classes, or namespaces to their corresponding documentation. Such a link can # be prevented in individual cases by by putting a % sign in front of the word # or globally by setting AUTOLINK_SUPPORT to NO. +# The default value is: YES. AUTOLINK_SUPPORT = YES # If you use STL classes (i.e. std::string, std::vector, etc.) but do not want -# to include (a tag file for) the STL sources as input, then you should -# set this tag to YES in order to let doxygen match functions declarations and -# definitions whose arguments contain STL classes (e.g. func(std::string); v.s. -# func(std::string) {}). This also makes the inheritance and collaboration +# to include (a tag file for) the STL sources as input, then you should set this +# tag to YES in order to let doxygen match functions declarations and +# definitions whose arguments contain STL classes (e.g. func(std::string); +# versus func(std::string) {}). This also make the inheritance and collaboration # diagrams that involve STL classes more complete and accurate. +# The default value is: NO. BUILTIN_STL_SUPPORT = YES # If you use Microsoft's C++/CLI language, you should set this option to YES to # enable parsing support. +# The default value is: NO. CPP_CLI_SUPPORT = NO -# Set the SIP_SUPPORT tag to YES if your project consists of sip sources only. -# Doxygen will parse them like normal C++ but will assume all classes use public -# instead of private inheritance when no explicit protection keyword is present. 
+# Set the SIP_SUPPORT tag to YES if your project consists of sip (see: +# http://www.riverbankcomputing.co.uk/software/sip/intro) sources only. Doxygen +# will parse them like normal C++ but will assume all classes use public instead +# of private inheritance when no explicit protection keyword is present. +# The default value is: NO. SIP_SUPPORT = NO # For Microsoft's IDL there are propget and propput attributes to indicate -# getter and setter methods for a property. Setting this option to YES (the -# default) will make doxygen replace the get and set methods by a property in -# the documentation. This will only work if the methods are indeed getting or -# setting a simple type. If this is not the case, or you want to show the -# methods anyway, you should set this option to NO. +# getter and setter methods for a property. Setting this option to YES will make +# doxygen to replace the get and set methods by a property in the documentation. +# This will only work if the methods are indeed getting or setting a simple +# type. If this is not the case, or you want to show the methods anyway, you +# should set this option to NO. +# The default value is: YES. IDL_PROPERTY_SUPPORT = YES @@ -294,51 +339,61 @@ IDL_PROPERTY_SUPPORT = YES # tag is set to YES, then doxygen will reuse the documentation of the first # member in the group (if any) for the other members of the group. By default # all members of a group must be documented explicitly. +# The default value is: NO. DISTRIBUTE_GROUP_DOC = NO -# Set the SUBGROUPING tag to YES (the default) to allow class member groups of -# the same type (for instance a group of public functions) to be put as a -# subgroup of that type (e.g. under the Public Functions section). Set it to -# NO to prevent subgrouping. Alternatively, this can be done per class using -# the \nosubgrouping command. +# Set the SUBGROUPING tag to YES to allow class member groups of the same type +# (for instance a group of public functions) to be put as a subgroup of that +# type (e.g. under the Public Functions section). Set it to NO to prevent +# subgrouping. Alternatively, this can be done per class using the +# \nosubgrouping command. +# The default value is: YES. SUBGROUPING = YES -# When the INLINE_GROUPED_CLASSES tag is set to YES, classes, structs and -# unions are shown inside the group in which they are included (e.g. using -# @ingroup) instead of on a separate page (for HTML and Man pages) or -# section (for LaTeX and RTF). +# When the INLINE_GROUPED_CLASSES tag is set to YES, classes, structs and unions +# are shown inside the group in which they are included (e.g. using \ingroup) +# instead of on a separate page (for HTML and Man pages) or section (for LaTeX +# and RTF). +# +# Note that this feature does not work in combination with +# SEPARATE_MEMBER_PAGES. +# The default value is: NO. INLINE_GROUPED_CLASSES = NO -# When the INLINE_SIMPLE_STRUCTS tag is set to YES, structs, classes, and -# unions with only public data fields or simple typedef fields will be shown -# inline in the documentation of the scope in which they are defined (i.e. file, +# When the INLINE_SIMPLE_STRUCTS tag is set to YES, structs, classes, and unions +# with only public data fields or simple typedef fields will be shown inline in +# the documentation of the scope in which they are defined (i.e. file, # namespace, or group documentation), provided this scope is documented. 
If set -# to NO (the default), structs, classes, and unions are shown on a separate -# page (for HTML and Man pages) or section (for LaTeX and RTF). +# to NO, structs, classes, and unions are shown on a separate page (for HTML and +# Man pages) or section (for LaTeX and RTF). +# The default value is: NO. INLINE_SIMPLE_STRUCTS = NO -# When TYPEDEF_HIDES_STRUCT is enabled, a typedef of a struct, union, or enum -# is documented as struct, union, or enum with the name of the typedef. So +# When TYPEDEF_HIDES_STRUCT tag is enabled, a typedef of a struct, union, or +# enum is documented as struct, union, or enum with the name of the typedef. So # typedef struct TypeS {} TypeT, will appear in the documentation as a struct # with name TypeT. When disabled the typedef will appear as a member of a file, -# namespace, or class. And the struct will be named TypeS. This can typically -# be useful for C code in case the coding convention dictates that all compound +# namespace, or class. And the struct will be named TypeS. This can typically be +# useful for C code in case the coding convention dictates that all compound # types are typedef'ed and only the typedef is referenced, never the tag name. +# The default value is: NO. TYPEDEF_HIDES_STRUCT = NO # The size of the symbol lookup cache can be set using LOOKUP_CACHE_SIZE. This -# cache is used to resolve symbols given their name and scope. Since this can -# be an expensive process and often the same symbol appear multiple times in -# the code, doxygen keeps a cache of pre-resolved symbols. If the cache is too -# small doxygen will become slower. If the cache is too large, memory is wasted. -# The cache size is given by this formula: 2^(16+LOOKUP_CACHE_SIZE). The valid -# range is 0..9, the default is 0, corresponding to a cache size of 2^16 = 65536 -# symbols. +# cache is used to resolve symbols given their name and scope. Since this can be +# an expensive process and often the same symbol appears multiple times in the +# code, doxygen keeps a cache of pre-resolved symbols. If the cache is too small +# doxygen will become slower. If the cache is too large, memory is wasted. The +# cache size is given by this formula: 2^(16+LOOKUP_CACHE_SIZE). The valid range +# is 0..9, the default is 0, corresponding to a cache size of 2^16=65536 +# symbols. At the end of a run doxygen will report the cache usage and suggest +# the optimal cache size from a speed point of view. +# Minimum value: 0, maximum value: 9, default value: 0. LOOKUP_CACHE_SIZE = 0 @@ -347,343 +402,391 @@ LOOKUP_CACHE_SIZE = 0 #--------------------------------------------------------------------------- # If the EXTRACT_ALL tag is set to YES doxygen will assume all entities in -# documentation are documented, even if no documentation was available. -# Private class members and static file members will be hidden unless -# the EXTRACT_PRIVATE respectively EXTRACT_STATIC tags are set to YES +# documentation are documented, even if no documentation was available. Private +# class members and static file members will be hidden unless the +# EXTRACT_PRIVATE respectively EXTRACT_STATIC tags are set to YES. +# Note: This will also disable the warnings about undocumented members that are +# normally produced when WARNINGS is set to YES. +# The default value is: NO. EXTRACT_ALL = NO -# If the EXTRACT_PRIVATE tag is set to YES all private members of a class -# will be included in the documentation. 
+# If the EXTRACT_PRIVATE tag is set to YES all private members of a class will +# be included in the documentation. +# The default value is: NO. EXTRACT_PRIVATE = NO # If the EXTRACT_PACKAGE tag is set to YES all members with package or internal # scope will be included in the documentation. +# The default value is: NO. EXTRACT_PACKAGE = NO -# If the EXTRACT_STATIC tag is set to YES all static members of a file -# will be included in the documentation. +# If the EXTRACT_STATIC tag is set to YES all static members of a file will be +# included in the documentation. +# The default value is: NO. EXTRACT_STATIC = NO -# If the EXTRACT_LOCAL_CLASSES tag is set to YES classes (and structs) -# defined locally in source files will be included in the documentation. -# If set to NO only classes defined in header files are included. +# If the EXTRACT_LOCAL_CLASSES tag is set to YES classes (and structs) defined +# locally in source files will be included in the documentation. If set to NO +# only classes defined in header files are included. Does not have any effect +# for Java sources. +# The default value is: YES. EXTRACT_LOCAL_CLASSES = YES -# This flag is only useful for Objective-C code. When set to YES local -# methods, which are defined in the implementation section but not in -# the interface are included in the documentation. -# If set to NO (the default) only methods in the interface are included. +# This flag is only useful for Objective-C code. When set to YES local methods, +# which are defined in the implementation section but not in the interface are +# included in the documentation. If set to NO only methods in the interface are +# included. +# The default value is: NO. EXTRACT_LOCAL_METHODS = NO # If this flag is set to YES, the members of anonymous namespaces will be # extracted and appear in the documentation as a namespace called -# 'anonymous_namespace{file}', where file will be replaced with the base -# name of the file that contains the anonymous namespace. By default -# anonymous namespaces are hidden. +# 'anonymous_namespace{file}', where file will be replaced with the base name of +# the file that contains the anonymous namespace. By default anonymous namespace +# are hidden. +# The default value is: NO. EXTRACT_ANON_NSPACES = NO -# If the HIDE_UNDOC_MEMBERS tag is set to YES, Doxygen will hide all -# undocumented members of documented classes, files or namespaces. -# If set to NO (the default) these members will be included in the -# various overviews, but no documentation section is generated. -# This option has no effect if EXTRACT_ALL is enabled. +# If the HIDE_UNDOC_MEMBERS tag is set to YES, doxygen will hide all +# undocumented members inside documented classes or files. If set to NO these +# members will be included in the various overviews, but no documentation +# section is generated. This option has no effect if EXTRACT_ALL is enabled. +# The default value is: NO. HIDE_UNDOC_MEMBERS = NO -# If the HIDE_UNDOC_CLASSES tag is set to YES, Doxygen will hide all -# undocumented classes that are normally visible in the class hierarchy. -# If set to NO (the default) these classes will be included in the various -# overviews. This option has no effect if EXTRACT_ALL is enabled. +# If the HIDE_UNDOC_CLASSES tag is set to YES, doxygen will hide all +# undocumented classes that are normally visible in the class hierarchy. If set +# to NO these classes will be included in the various overviews. This option has +# no effect if EXTRACT_ALL is enabled. 
+# The default value is: NO. HIDE_UNDOC_CLASSES = NO -# If the HIDE_FRIEND_COMPOUNDS tag is set to YES, Doxygen will hide all -# friend (class|struct|union) declarations. -# If set to NO (the default) these declarations will be included in the -# documentation. +# If the HIDE_FRIEND_COMPOUNDS tag is set to YES, doxygen will hide all friend +# (class|struct|union) declarations. If set to NO these declarations will be +# included in the documentation. +# The default value is: NO. HIDE_FRIEND_COMPOUNDS = NO -# If the HIDE_IN_BODY_DOCS tag is set to YES, Doxygen will hide any -# documentation blocks found inside the body of a function. -# If set to NO (the default) these blocks will be appended to the -# function's detailed documentation block. +# If the HIDE_IN_BODY_DOCS tag is set to YES, doxygen will hide any +# documentation blocks found inside the body of a function. If set to NO these +# blocks will be appended to the function's detailed documentation block. +# The default value is: NO. HIDE_IN_BODY_DOCS = NO -# The INTERNAL_DOCS tag determines if documentation -# that is typed after a \internal command is included. If the tag is set -# to NO (the default) then the documentation will be excluded. -# Set it to YES to include the internal documentation. +# The INTERNAL_DOCS tag determines if documentation that is typed after a +# \internal command is included. If the tag is set to NO then the documentation +# will be excluded. Set it to YES to include the internal documentation. +# The default value is: NO. INTERNAL_DOCS = NO -# If the CASE_SENSE_NAMES tag is set to NO then Doxygen will only generate -# file names in lower-case letters. If set to YES upper-case letters are also +# If the CASE_SENSE_NAMES tag is set to NO then doxygen will only generate file +# names in lower-case letters. If set to YES upper-case letters are also # allowed. This is useful if you have classes or files whose names only differ # in case and if your file system supports case sensitive file names. Windows # and Mac users are advised to set this option to NO. +# The default value is: system dependent. CASE_SENSE_NAMES = YES -# If the HIDE_SCOPE_NAMES tag is set to NO (the default) then Doxygen -# will show members with their full class and namespace scopes in the -# documentation. If set to YES the scope will be hidden. +# If the HIDE_SCOPE_NAMES tag is set to NO then doxygen will show members with +# their full class and namespace scopes in the documentation. If set to YES the +# scope will be hidden. +# The default value is: NO. HIDE_SCOPE_NAMES = YES -# If the SHOW_INCLUDE_FILES tag is set to YES (the default) then Doxygen -# will put a list of the files that are included by a file in the documentation -# of that file. +# If the SHOW_INCLUDE_FILES tag is set to YES then doxygen will put a list of +# the files that are included by a file in the documentation of that file. +# The default value is: YES. SHOW_INCLUDE_FILES = YES -# If the FORCE_LOCAL_INCLUDES tag is set to YES then Doxygen -# will list include files with double quotes in the documentation -# rather than with sharp brackets. +# If the SHOW_GROUPED_MEMB_INC tag is set to YES then Doxygen will add for each +# grouped member an include statement to the documentation, telling the reader +# which file to include in order to use the member. +# The default value is: NO. 
+ +SHOW_GROUPED_MEMB_INC = NO + +# If the FORCE_LOCAL_INCLUDES tag is set to YES then doxygen will list include +# files with double quotes in the documentation rather than with sharp brackets. +# The default value is: NO. FORCE_LOCAL_INCLUDES = NO -# If the INLINE_INFO tag is set to YES (the default) then a tag [inline] -# is inserted in the documentation for inline members. +# If the INLINE_INFO tag is set to YES then a tag [inline] is inserted in the +# documentation for inline members. +# The default value is: YES. INLINE_INFO = YES -# If the SORT_MEMBER_DOCS tag is set to YES (the default) then doxygen -# will sort the (detailed) documentation of file and class members -# alphabetically by member name. If set to NO the members will appear in -# declaration order. +# If the SORT_MEMBER_DOCS tag is set to YES then doxygen will sort the +# (detailed) documentation of file and class members alphabetically by member +# name. If set to NO the members will appear in declaration order. +# The default value is: YES. SORT_MEMBER_DOCS = YES -# If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the -# brief documentation of file, namespace and class members alphabetically -# by member name. If set to NO (the default) the members will appear in -# declaration order. +# If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the brief +# descriptions of file, namespace and class members alphabetically by member +# name. If set to NO the members will appear in declaration order. Note that +# this will also influence the order of the classes in the class list. +# The default value is: NO. SORT_BRIEF_DOCS = NO -# If the SORT_MEMBERS_CTORS_1ST tag is set to YES then doxygen -# will sort the (brief and detailed) documentation of class members so that -# constructors and destructors are listed first. If set to NO (the default) -# the constructors will appear in the respective orders defined by -# SORT_MEMBER_DOCS and SORT_BRIEF_DOCS. -# This tag will be ignored for brief docs if SORT_BRIEF_DOCS is set to NO -# and ignored for detailed docs if SORT_MEMBER_DOCS is set to NO. +# If the SORT_MEMBERS_CTORS_1ST tag is set to YES then doxygen will sort the +# (brief and detailed) documentation of class members so that constructors and +# destructors are listed first. If set to NO the constructors will appear in the +# respective orders defined by SORT_BRIEF_DOCS and SORT_MEMBER_DOCS. +# Note: If SORT_BRIEF_DOCS is set to NO this option is ignored for sorting brief +# member documentation. +# Note: If SORT_MEMBER_DOCS is set to NO this option is ignored for sorting +# detailed member documentation. +# The default value is: NO. SORT_MEMBERS_CTORS_1ST = NO -# If the SORT_GROUP_NAMES tag is set to YES then doxygen will sort the -# hierarchy of group names into alphabetical order. If set to NO (the default) -# the group names will appear in their defined order. +# If the SORT_GROUP_NAMES tag is set to YES then doxygen will sort the hierarchy +# of group names into alphabetical order. If set to NO the group names will +# appear in their defined order. +# The default value is: NO. SORT_GROUP_NAMES = NO -# If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be -# sorted by fully-qualified names, including namespaces. If set to -# NO (the default), the class list will be sorted only by class name, -# not including the namespace part. +# If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be sorted by +# fully-qualified names, including namespaces. 
If set to NO, the class list will +# be sorted only by class name, not including the namespace part. # Note: This option is not very useful if HIDE_SCOPE_NAMES is set to YES. -# Note: This option applies only to the class list, not to the -# alphabetical list. +# Note: This option applies only to the class list, not to the alphabetical +# list. +# The default value is: NO. SORT_BY_SCOPE_NAME = NO -# If the STRICT_PROTO_MATCHING option is enabled and doxygen fails to -# do proper type resolution of all parameters of a function it will reject a -# match between the prototype and the implementation of a member function even -# if there is only one candidate or it is obvious which candidate to choose -# by doing a simple string match. By disabling STRICT_PROTO_MATCHING doxygen -# will still accept a match between prototype and implementation in such cases. +# If the STRICT_PROTO_MATCHING option is enabled and doxygen fails to do proper +# type resolution of all parameters of a function it will reject a match between +# the prototype and the implementation of a member function even if there is +# only one candidate or it is obvious which candidate to choose by doing a +# simple string match. By disabling STRICT_PROTO_MATCHING doxygen will still +# accept a match between prototype and implementation in such cases. +# The default value is: NO. STRICT_PROTO_MATCHING = NO -# The GENERATE_TODOLIST tag can be used to enable (YES) or -# disable (NO) the todo list. This list is created by putting \todo -# commands in the documentation. +# The GENERATE_TODOLIST tag can be used to enable ( YES) or disable ( NO) the +# todo list. This list is created by putting \todo commands in the +# documentation. +# The default value is: YES. GENERATE_TODOLIST = YES -# The GENERATE_TESTLIST tag can be used to enable (YES) or -# disable (NO) the test list. This list is created by putting \test -# commands in the documentation. +# The GENERATE_TESTLIST tag can be used to enable ( YES) or disable ( NO) the +# test list. This list is created by putting \test commands in the +# documentation. +# The default value is: YES. GENERATE_TESTLIST = YES -# The GENERATE_BUGLIST tag can be used to enable (YES) or -# disable (NO) the bug list. This list is created by putting \bug -# commands in the documentation. +# The GENERATE_BUGLIST tag can be used to enable ( YES) or disable ( NO) the bug +# list. This list is created by putting \bug commands in the documentation. +# The default value is: YES. GENERATE_BUGLIST = YES -# The GENERATE_DEPRECATEDLIST tag can be used to enable (YES) or -# disable (NO) the deprecated list. This list is created by putting -# \deprecated commands in the documentation. +# The GENERATE_DEPRECATEDLIST tag can be used to enable ( YES) or disable ( NO) +# the deprecated list. This list is created by putting \deprecated commands in +# the documentation. +# The default value is: YES. GENERATE_DEPRECATEDLIST= YES -# The ENABLED_SECTIONS tag can be used to enable conditional -# documentation sections, marked by \if section-label ... \endif -# and \cond section-label ... \endcond blocks. +# The ENABLED_SECTIONS tag can be used to enable conditional documentation +# sections, marked by \if ... \endif and \cond +# ... \endcond blocks. ENABLED_SECTIONS = -# The MAX_INITIALIZER_LINES tag determines the maximum number of lines -# the initial value of a variable or macro consists of for it to appear in -# the documentation. If the initializer consists of more lines than specified -# here it will be hidden. 
Use a value of 0 to hide initializers completely. -# The appearance of the initializer of individual variables and macros in the -# documentation can be controlled using \showinitializer or \hideinitializer -# command in the documentation regardless of this setting. +# The MAX_INITIALIZER_LINES tag determines the maximum number of lines that the +# initial value of a variable or macro / define can have for it to appear in the +# documentation. If the initializer consists of more lines than specified here +# it will be hidden. Use a value of 0 to hide initializers completely. The +# appearance of the value of individual variables and macros / defines can be +# controlled using \showinitializer or \hideinitializer command in the +# documentation regardless of this setting. +# Minimum value: 0, maximum value: 10000, default value: 30. MAX_INITIALIZER_LINES = 30 -# Set the SHOW_USED_FILES tag to NO to disable the list of files generated -# at the bottom of the documentation of classes and structs. If set to YES the -# list will mention the files that were used to generate the documentation. +# Set the SHOW_USED_FILES tag to NO to disable the list of files generated at +# the bottom of the documentation of classes and structs. If set to YES the list +# will mention the files that were used to generate the documentation. +# The default value is: YES. SHOW_USED_FILES = YES -# Set the SHOW_FILES tag to NO to disable the generation of the Files page. -# This will remove the Files entry from the Quick Index and from the -# Folder Tree View (if specified). The default is YES. +# Set the SHOW_FILES tag to NO to disable the generation of the Files page. This +# will remove the Files entry from the Quick Index and from the Folder Tree View +# (if specified). +# The default value is: YES. SHOW_FILES = YES -# Set the SHOW_NAMESPACES tag to NO to disable the generation of the -# Namespaces page. -# This will remove the Namespaces entry from the Quick Index -# and from the Folder Tree View (if specified). The default is YES. +# Set the SHOW_NAMESPACES tag to NO to disable the generation of the Namespaces +# page. This will remove the Namespaces entry from the Quick Index and from the +# Folder Tree View (if specified). +# The default value is: YES. SHOW_NAMESPACES = YES # The FILE_VERSION_FILTER tag can be used to specify a program or script that # doxygen should invoke to get the current version for each file (typically from # the version control system). Doxygen will invoke the program by executing (via -# popen()) the command , where is the value of -# the FILE_VERSION_FILTER tag, and is the name of an input file -# provided by doxygen. Whatever the program writes to standard output -# is used as the file version. See the manual for examples. +# popen()) the command command input-file, where command is the value of the +# FILE_VERSION_FILTER tag, and input-file is the name of an input file provided +# by doxygen. Whatever the program writes to standard output is used as the file +# version. For an example see the documentation. FILE_VERSION_FILTER = # The LAYOUT_FILE tag can be used to specify a layout file which will be parsed # by doxygen. The layout file controls the global structure of the generated # output files in an output format independent way. To create the layout file -# that represents doxygen's defaults, run doxygen with the -l option. -# You can optionally specify a file name after the option, if omitted -# DoxygenLayout.xml will be used as the name of the layout file. 
+# that represents doxygen's defaults, run doxygen with the -l option. You can +# optionally specify a file name after the option, if omitted DoxygenLayout.xml +# will be used as the name of the layout file. +# +# Note that if you run doxygen from a directory containing a file called +# DoxygenLayout.xml, doxygen will parse it automatically even if the LAYOUT_FILE +# tag is left empty. LAYOUT_FILE = -# The CITE_BIB_FILES tag can be used to specify one or more bib files -# containing the references data. This must be a list of .bib files. The -# .bib extension is automatically appended if omitted. Using this command -# requires the bibtex tool to be installed. See also -# http://en.wikipedia.org/wiki/BibTeX for more info. For LaTeX the style -# of the bibliography can be controlled using LATEX_BIB_STYLE. To use this -# feature you need bibtex and perl available in the search path. Do not use -# file names with spaces, bibtex cannot handle them. +# The CITE_BIB_FILES tag can be used to specify one or more bib files containing +# the reference definitions. This must be a list of .bib files. The .bib +# extension is automatically appended if omitted. This requires the bibtex tool +# to be installed. See also http://en.wikipedia.org/wiki/BibTeX for more info. +# For LaTeX the style of the bibliography can be controlled using +# LATEX_BIB_STYLE. To use this feature you need bibtex and perl available in the +# search path. Do not use file names with spaces, bibtex cannot handle them. See +# also \cite for info how to create references. CITE_BIB_FILES = #--------------------------------------------------------------------------- -# configuration options related to warning and progress messages +# Configuration options related to warning and progress messages #--------------------------------------------------------------------------- -# The QUIET tag can be used to turn on/off the messages that are generated -# by doxygen. Possible values are YES and NO. If left blank NO is used. +# The QUIET tag can be used to turn on/off the messages that are generated to +# standard output by doxygen. If QUIET is set to YES this implies that the +# messages are off. +# The default value is: NO. QUIET = YES # The WARNINGS tag can be used to turn on/off the warning messages that are -# generated by doxygen. Possible values are YES and NO. If left blank -# NO is used. +# generated to standard error ( stderr) by doxygen. If WARNINGS is set to YES +# this implies that the warnings are on. +# +# Tip: Turn warnings on while writing the documentation. +# The default value is: YES. WARNINGS = YES -# If WARN_IF_UNDOCUMENTED is set to YES, then doxygen will generate warnings -# for undocumented members. If EXTRACT_ALL is set to YES then this flag will -# automatically be disabled. +# If the WARN_IF_UNDOCUMENTED tag is set to YES, then doxygen will generate +# warnings for undocumented members. If EXTRACT_ALL is set to YES then this flag +# will automatically be disabled. +# The default value is: YES. WARN_IF_UNDOCUMENTED = NO -# If WARN_IF_DOC_ERROR is set to YES, doxygen will generate warnings for -# potential errors in the documentation, such as not documenting some -# parameters in a documented function, or documenting parameters that -# don't exist or using markup commands wrongly. 
+# If the WARN_IF_DOC_ERROR tag is set to YES, doxygen will generate warnings for +# potential errors in the documentation, such as not documenting some parameters +# in a documented function, or documenting parameters that don't exist or using +# markup commands wrongly. +# The default value is: YES. WARN_IF_DOC_ERROR = YES -# The WARN_NO_PARAMDOC option can be enabled to get warnings for -# functions that are documented, but have no documentation for their parameters -# or return value. If set to NO (the default) doxygen will only warn about -# wrong or incomplete parameter documentation, but not about the absence of -# documentation. +# This WARN_NO_PARAMDOC option can be enabled to get warnings for functions that +# are documented, but have no documentation for their parameters or return +# value. If set to NO doxygen will only warn about wrong or incomplete parameter +# documentation, but not about the absence of documentation. +# The default value is: NO. WARN_NO_PARAMDOC = NO -# The WARN_FORMAT tag determines the format of the warning messages that -# doxygen can produce. The string should contain the $file, $line, and $text -# tags, which will be replaced by the file and line number from which the -# warning originated and the warning text. Optionally the format may contain -# $version, which will be replaced by the version of the file (if it could -# be obtained via FILE_VERSION_FILTER) +# The WARN_FORMAT tag determines the format of the warning messages that doxygen +# can produce. The string should contain the $file, $line, and $text tags, which +# will be replaced by the file and line number from which the warning originated +# and the warning text. Optionally the format may contain $version, which will +# be replaced by the version of the file (if it could be obtained via +# FILE_VERSION_FILTER) +# The default value is: $file:$line: $text. WARN_FORMAT = "$file:$line: $text" -# The WARN_LOGFILE tag can be used to specify a file to which warning -# and error messages should be written. If left blank the output is written -# to stderr. +# The WARN_LOGFILE tag can be used to specify a file to which warning and error +# messages should be written. If left blank the output is written to standard +# error (stderr). WARN_LOGFILE = #--------------------------------------------------------------------------- -# configuration options related to the input files +# Configuration options related to the input files #--------------------------------------------------------------------------- -# The INPUT tag can be used to specify the files and/or directories that contain -# documented source files. You may enter file names like "myfile.cpp" or -# directories like "/usr/src/myproject". Separate the files or directories -# with spaces. +# The INPUT tag is used to specify the files and/or directories that contain +# documented source files. You may enter file names like myfile.cpp or +# directories like /usr/src/myproject. Separate the files or directories with +# spaces. +# Note: If this tag is empty the current directory is searched. INPUT = ../apt-pkg # This tag can be used to specify the character encoding of the source files -# that doxygen parses. Internally doxygen uses the UTF-8 encoding, which is -# also the default input encoding. Doxygen uses libiconv (or the iconv built -# into libc) for the transcoding. See http://www.gnu.org/software/libiconv for -# the list of possible encodings. +# that doxygen parses. Internally doxygen uses the UTF-8 encoding. 
Doxygen uses +# libiconv (or the iconv built into libc) for the transcoding. See the libiconv +# documentation (see: http://www.gnu.org/software/libiconv) for the list of +# possible encodings. +# The default value is: UTF-8. INPUT_ENCODING = UTF-8 # If the value of the INPUT tag contains directories, you can use the -# FILE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp -# and *.h) to filter out the source-files in the directories. If left -# blank the following patterns are tested: -# *.c *.cc *.cxx *.cpp *.c++ *.d *.java *.ii *.ixx *.ipp *.i++ *.inl *.h *.hh -# *.hxx *.hpp *.h++ *.idl *.odl *.cs *.php *.php3 *.inc *.m *.mm *.dox *.py -# *.f90 *.f *.for *.vhd *.vhdl +# FILE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp and +# *.h) to filter out the source-files in the directories. If left blank the +# following patterns are tested:*.c, *.cc, *.cxx, *.cpp, *.c++, *.java, *.ii, +# *.ixx, *.ipp, *.i++, *.inl, *.idl, *.ddl, *.odl, *.h, *.hh, *.hxx, *.hpp, +# *.h++, *.cs, *.d, *.php, *.php4, *.php5, *.phtml, *.inc, *.m, *.markdown, +# *.md, *.mm, *.dox, *.py, *.f90, *.f, *.for, *.tcl, *.vhd, *.vhdl, *.ucf, +# *.qsf, *.as and *.js. FILE_PATTERNS = *.cc \ *.h -# The RECURSIVE tag can be used to turn specify whether or not subdirectories -# should be searched for input files as well. Possible values are YES and NO. -# If left blank NO is used. +# The RECURSIVE tag can be used to specify whether or not subdirectories should +# be searched for input files as well. +# The default value is: NO. RECURSIVE = YES # The EXCLUDE tag can be used to specify files and/or directories that should be # excluded from the INPUT source files. This way you can easily exclude a # subdirectory from a directory tree whose root is specified with the INPUT tag. +# # Note that relative paths are relative to the directory from which doxygen is # run. @@ -692,14 +795,16 @@ EXCLUDE = # The EXCLUDE_SYMLINKS tag can be used to select whether or not files or # directories that are symbolic links (a Unix file system feature) are excluded # from the input. +# The default value is: NO. EXCLUDE_SYMLINKS = NO # If the value of the INPUT tag contains directories, you can use the # EXCLUDE_PATTERNS tag to specify one or more wildcard patterns to exclude -# certain files from those directories. Note that the wildcards are matched -# against the file with absolute path, so to exclude all test directories -# for example use the pattern */test/* +# certain files from those directories. +# +# Note that the wildcards are matched against the file with absolute path, so to +# exclude all test directories for example use the pattern */test/* EXCLUDE_PATTERNS = @@ -708,42 +813,49 @@ EXCLUDE_PATTERNS = # output. The symbol name can be a fully qualified name, a word, or if the # wildcard * is used, a substring. Examples: ANamespace, AClass, # AClass::ANamespace, ANamespace::*Test +# +# Note that the wildcards are matched against the file with absolute path, so to +# exclude all test directories use the pattern */test/* EXCLUDE_SYMBOLS = -# The EXAMPLE_PATH tag can be used to specify one or more files or -# directories that contain example code fragments that are included (see -# the \include command). +# The EXAMPLE_PATH tag can be used to specify one or more files or directories +# that contain example code fragments that are included (see the \include +# command). 
EXAMPLE_PATH = # If the value of the EXAMPLE_PATH tag contains directories, you can use the -# EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp -# and *.h) to filter out the source-files in the directories. If left -# blank all files are included. +# EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp and +# *.h) to filter out the source-files in the directories. If left blank all +# files are included. EXAMPLE_PATTERNS = # If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be -# searched for input files to be used with the \include or \dontinclude -# commands irrespective of the value of the RECURSIVE tag. -# Possible values are YES and NO. If left blank NO is used. +# searched for input files to be used with the \include or \dontinclude commands +# irrespective of the value of the RECURSIVE tag. +# The default value is: NO. EXAMPLE_RECURSIVE = NO -# The IMAGE_PATH tag can be used to specify one or more files or -# directories that contain image that are included in the documentation (see -# the \image command). +# The IMAGE_PATH tag can be used to specify one or more files or directories +# that contain images that are to be included in the documentation (see the +# \image command). IMAGE_PATH = # The INPUT_FILTER tag can be used to specify a program that doxygen should # invoke to filter for each input file. Doxygen will invoke the filter program -# by executing (via popen()) the command <filter> <input-file>, where <filter> -# is the value of the INPUT_FILTER tag, and <input-file> is the name of an -# input file. Doxygen will then use the output that the filter program writes -# to standard output. -# If FILTER_PATTERNS is specified, this tag will be ignored. +# by executing (via popen()) the command: +# +# <filter> <input-file> +# +# where <filter> is the value of the INPUT_FILTER tag, and <input-file> is the +# name of an input file. Doxygen will then use the output that the filter +# program writes to standard output. If FILTER_PATTERNS is specified, this tag +# will be ignored. +# # Note that the filter must not add or remove lines; it is applied before the # code is scanned, but not when the output code is generated. If lines are added # or removed, the anchors will not be placed correctly. @@ -751,172 +863,222 @@ IMAGE_PATH = INPUT_FILTER = "sed -e 's#//[ ]*FIXME:\?#/// \\todo#'" # The FILTER_PATTERNS tag can be used to specify filters on a per file pattern -# basis. -# Doxygen will compare the file name with each pattern and apply the -# filter if there is a match. -# The filters are a list of the form: -# pattern=filter (like *.cpp=my_cpp_filter). See INPUT_FILTER for further -# info on how filters are used. If FILTER_PATTERNS is empty or if -# non of the patterns match the file name, INPUT_FILTER is applied. +# basis. Doxygen will compare the file name with each pattern and apply the +# filter if there is a match. The filters are a list of the form: pattern=filter +# (like *.cpp=my_cpp_filter). See INPUT_FILTER for further information on how +# filters are used. If the FILTER_PATTERNS tag is empty or if none of the +# patterns match the file name, INPUT_FILTER is applied. FILTER_PATTERNS = # If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using -# INPUT_FILTER) will be used to filter the input files when producing source -# files to browse (i.e. when SOURCE_BROWSER is set to YES). +# INPUT_FILTER ) will also be used to filter the input files that are used for +# producing the source files to browse (i.e. when SOURCE_BROWSER is set to YES). +# The default value is: NO.
FILTER_SOURCE_FILES = NO # The FILTER_SOURCE_PATTERNS tag can be used to specify source filters per file -# pattern. A pattern will override the setting for FILTER_PATTERN (if any) -# and it is also possible to disable source filtering for a specific pattern -# using *.ext= (so without naming a filter). This option only has effect when -# FILTER_SOURCE_FILES is enabled. +# pattern. A pattern will override the setting for FILTER_PATTERN (if any) and +# it is also possible to disable source filtering for a specific pattern using +# *.ext= (so without naming a filter). +# This tag requires that the tag FILTER_SOURCE_FILES is set to YES. FILTER_SOURCE_PATTERNS = -# If the USE_MD_FILE_AS_MAINPAGE tag refers to the name of a markdown file that +# If the USE_MDFILE_AS_MAINPAGE tag refers to the name of a markdown file that # is part of the input, its contents will be placed on the main page # (index.html). This can be useful if you have a project on for instance GitHub -# and want reuse the introduction page also for the doxygen output. +# and want to reuse the introduction page also for the doxygen output. USE_MDFILE_AS_MAINPAGE = #--------------------------------------------------------------------------- -# configuration options related to source browsing +# Configuration options related to source browsing #--------------------------------------------------------------------------- -# If the SOURCE_BROWSER tag is set to YES then a list of source files will -# be generated. Documented entities will be cross-referenced with these sources. -# Note: To get rid of all source code in the generated output, make sure also -# VERBATIM_HEADERS is set to NO. +# If the SOURCE_BROWSER tag is set to YES then a list of source files will be +# generated. Documented entities will be cross-referenced with these sources. +# +# Note: To get rid of all source code in the generated output, make sure that +# also VERBATIM_HEADERS is set to NO. +# The default value is: NO. SOURCE_BROWSER = NO -# Setting the INLINE_SOURCES tag to YES will include the body -# of functions and classes directly in the documentation. +# Setting the INLINE_SOURCES tag to YES will include the body of functions, +# classes and enums directly into the documentation. +# The default value is: NO. INLINE_SOURCES = NO -# Setting the STRIP_CODE_COMMENTS tag to YES (the default) will instruct -# doxygen to hide any special comment blocks from generated source code -# fragments. Normal C, C++ and Fortran comments will always remain visible. +# Setting the STRIP_CODE_COMMENTS tag to YES will instruct doxygen to hide any +# special comment blocks from generated source code fragments. Normal C, C++ and +# Fortran comments will always remain visible. +# The default value is: YES. STRIP_CODE_COMMENTS = YES -# If the REFERENCED_BY_RELATION tag is set to YES -# then for each documented function all documented -# functions referencing it will be listed. +# If the REFERENCED_BY_RELATION tag is set to YES then for each documented +# function all documented functions referencing it will be listed. +# The default value is: NO. REFERENCED_BY_RELATION = YES -# If the REFERENCES_RELATION tag is set to YES -# then for each documented function all documented entities -# called/used by that function will be listed. +# If the REFERENCES_RELATION tag is set to YES then for each documented function +# all documented entities called/used by that function will be listed. +# The default value is: NO. 
REFERENCES_RELATION = YES -# If the REFERENCES_LINK_SOURCE tag is set to YES (the default) -# and SOURCE_BROWSER tag is set to YES, then the hyperlinks from -# functions in REFERENCES_RELATION and REFERENCED_BY_RELATION lists will -# link to the source code. -# Otherwise they will link to the documentation. +# If the REFERENCES_LINK_SOURCE tag is set to YES and SOURCE_BROWSER tag is set +# to YES, then the hyperlinks from functions in REFERENCES_RELATION and +# REFERENCED_BY_RELATION lists will link to the source code. Otherwise they will +# link to the documentation. +# The default value is: YES. REFERENCES_LINK_SOURCE = YES -# If the USE_HTAGS tag is set to YES then the references to source code -# will point to the HTML generated by the htags(1) tool instead of doxygen -# built-in source browser. The htags tool is part of GNU's global source -# tagging system (see http://www.gnu.org/software/global/global.html). You -# will need version 4.8.6 or higher. +# If SOURCE_TOOLTIPS is enabled (the default) then hovering a hyperlink in the +# source code will show a tooltip with additional information such as prototype, +# brief description and links to the definition and documentation. Since this +# will make the HTML file larger and loading of large files a bit slower, you +# can opt to disable this feature. +# The default value is: YES. +# This tag requires that the tag SOURCE_BROWSER is set to YES. + +SOURCE_TOOLTIPS = YES + +# If the USE_HTAGS tag is set to YES then the references to source code will +# point to the HTML generated by the htags(1) tool instead of doxygen built-in +# source browser. The htags tool is part of GNU's global source tagging system +# (see http://www.gnu.org/software/global/global.html). You will need version +# 4.8.6 or higher. +# +# To use it do the following: +# - Install the latest version of global +# - Enable SOURCE_BROWSER and USE_HTAGS in the config file +# - Make sure the INPUT points to the root of the source tree +# - Run doxygen as normal +# +# Doxygen will invoke htags (and that will in turn invoke gtags), so these +# tools must be available from the command line (i.e. in the search path). +# +# The result: instead of the source browser generated by doxygen, the links to +# source code will now point to the output of htags. +# The default value is: NO. +# This tag requires that the tag SOURCE_BROWSER is set to YES. USE_HTAGS = NO -# If the VERBATIM_HEADERS tag is set to YES (the default) then Doxygen -# will generate a verbatim copy of the header file for each class for -# which an include is specified. Set to NO to disable this. +# If the VERBATIM_HEADERS tag is set the YES then doxygen will generate a +# verbatim copy of the header file for each class for which an include is +# specified. Set to NO to disable this. +# See also: Section \class. +# The default value is: YES. VERBATIM_HEADERS = YES #--------------------------------------------------------------------------- -# configuration options related to the alphabetical class index +# Configuration options related to the alphabetical class index #--------------------------------------------------------------------------- -# If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index -# of all compounds will be generated. Enable this if the project -# contains a lot of classes, structs, unions or interfaces. +# If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index of all +# compounds will be generated. 
Enable this if the project contains a lot of +# classes, structs, unions or interfaces. +# The default value is: YES. ALPHABETICAL_INDEX = NO -# If the alphabetical index is enabled (see ALPHABETICAL_INDEX) then -# the COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns -# in which this list will be split (can be a number in the range [1..20]) +# The COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns in +# which the alphabetical index list will be split. +# Minimum value: 1, maximum value: 20, default value: 5. +# This tag requires that the tag ALPHABETICAL_INDEX is set to YES. COLS_IN_ALPHA_INDEX = 5 -# In case all classes in a project start with a common prefix, all -# classes will be put under the same header in the alphabetical index. -# The IGNORE_PREFIX tag can be used to specify one or more prefixes that -# should be ignored while generating the index headers. +# In case all classes in a project start with a common prefix, all classes will +# be put under the same header in the alphabetical index. The IGNORE_PREFIX tag +# can be used to specify a prefix (or a list of prefixes) that should be ignored +# while generating the index headers. +# This tag requires that the tag ALPHABETICAL_INDEX is set to YES. IGNORE_PREFIX = #--------------------------------------------------------------------------- -# configuration options related to the HTML output +# Configuration options related to the HTML output #--------------------------------------------------------------------------- -# If the GENERATE_HTML tag is set to YES (the default) Doxygen will -# generate HTML output. +# If the GENERATE_HTML tag is set to YES doxygen will generate HTML output +# The default value is: YES. GENERATE_HTML = YES -# The HTML_OUTPUT tag is used to specify where the HTML docs will be put. -# If a relative path is entered the value of OUTPUT_DIRECTORY will be -# put in front of it. If left blank `html' will be used as the default path. +# The HTML_OUTPUT tag is used to specify where the HTML docs will be put. If a +# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of +# it. +# The default directory is: html. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_OUTPUT = html -# The HTML_FILE_EXTENSION tag can be used to specify the file extension for -# each generated HTML page (for example: .htm,.php,.asp). If it is left blank -# doxygen will generate files with .html extension. +# The HTML_FILE_EXTENSION tag can be used to specify the file extension for each +# generated HTML page (for example: .htm, .php, .asp). +# The default value is: .html. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_FILE_EXTENSION = .xhtml -# The HTML_HEADER tag can be used to specify a personal HTML header for -# each generated HTML page. If it is left blank doxygen will generate a -# standard header. Note that when using a custom header you are responsible -# for the proper inclusion of any scripts and style sheets that doxygen -# needs, which is dependent on the configuration options used. -# It is advised to generate a default header using "doxygen -w html -# header.html footer.html stylesheet.css YourConfigFile" and then modify -# that header. Note that the header is subject to change so you typically -# have to redo this when upgrading to a newer version of doxygen or when -# changing the value of configuration settings such as GENERATE_TREEVIEW! 
+# The HTML_HEADER tag can be used to specify a user-defined HTML header file for +# each generated HTML page. If the tag is left blank doxygen will generate a +# standard header. +# +# To get valid HTML the header file that includes any scripts and style sheets +# that doxygen needs, which is dependent on the configuration options used (e.g. +# the setting GENERATE_TREEVIEW). It is highly recommended to start with a +# default header using +# doxygen -w html new_header.html new_footer.html new_stylesheet.css +# YourConfigFile +# and then modify the file new_header.html. See also section "Doxygen usage" +# for information on how to generate the default header that doxygen normally +# uses. +# Note: The header is subject to change so you typically have to regenerate the +# default header when upgrading to a newer version of doxygen. For a description +# of the possible markers and block names see the documentation. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_HEADER = -# The HTML_FOOTER tag can be used to specify a personal HTML footer for -# each generated HTML page. If it is left blank doxygen will generate a -# standard footer. +# The HTML_FOOTER tag can be used to specify a user-defined HTML footer for each +# generated HTML page. If the tag is left blank doxygen will generate a standard +# footer. See HTML_HEADER for more information on how to generate a default +# footer and what special commands can be used inside the footer. See also +# section "Doxygen usage" for information on how to generate the default footer +# that doxygen normally uses. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_FOOTER = -# The HTML_STYLESHEET tag can be used to specify a user-defined cascading -# style sheet that is used by each HTML page. It can be used to -# fine-tune the look of the HTML output. If left blank doxygen will -# generate a default style sheet. Note that it is recommended to use -# HTML_EXTRA_STYLESHEET instead of this one, as it is more robust and this -# tag will in the future become obsolete. +# The HTML_STYLESHEET tag can be used to specify a user-defined cascading style +# sheet that is used by each HTML page. It can be used to fine-tune the look of +# the HTML output. If left blank doxygen will generate a default style sheet. +# See also section "Doxygen usage" for information on how to generate the style +# sheet that doxygen normally uses. +# Note: It is recommended to use HTML_EXTRA_STYLESHEET instead of this tag, as +# it is more robust and this tag (HTML_STYLESHEET) will in the future become +# obsolete. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_STYLESHEET = -# The HTML_EXTRA_STYLESHEET tag can be used to specify an additional -# user-defined cascading style sheet that is included after the standard -# style sheets created by doxygen. Using this option one can overrule -# certain style aspects. This is preferred over using HTML_STYLESHEET -# since it does not replace the standard style sheet and is therefor more -# robust against future updates. Doxygen will copy the style sheet file to -# the output directory. +# The HTML_EXTRA_STYLESHEET tag can be used to specify an additional user- +# defined cascading style sheet that is included after the standard style sheets +# created by doxygen. Using this option one can overrule certain style aspects. +# This is preferred over using HTML_STYLESHEET since it does not replace the +# standard style sheet and is therefor more robust against future updates. 
+# Doxygen will copy the style sheet file to the output directory. For an example +# see the documentation. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_EXTRA_STYLESHEET = @@ -924,632 +1086,803 @@ HTML_EXTRA_STYLESHEET = # other source files which should be copied to the HTML output directory. Note # that these files will be copied to the base HTML output directory. Use the # $relpath^ marker in the HTML_HEADER and/or HTML_FOOTER files to load these -# files. In the HTML_STYLESHEET file, use the file name only. Also note that -# the files will be copied as-is; there are no commands or markers available. +# files. In the HTML_STYLESHEET file, use the file name only. Also note that the +# files will be copied as-is; there are no commands or markers available. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_EXTRA_FILES = -# The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. -# Doxygen will adjust the colors in the style sheet and background images -# according to this color. Hue is specified as an angle on a colorwheel, -# see http://en.wikipedia.org/wiki/Hue for more information. -# For instance the value 0 represents red, 60 is yellow, 120 is green, -# 180 is cyan, 240 is blue, 300 purple, and 360 is red again. -# The allowed range is 0 to 359. +# The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen +# will adjust the colors in the stylesheet and background images according to +# this color. Hue is specified as an angle on a colorwheel, see +# http://en.wikipedia.org/wiki/Hue for more information. For instance the value +# 0 represents red, 60 is yellow, 120 is green, 180 is cyan, 240 is blue, 300 +# purple, and 360 is red again. +# Minimum value: 0, maximum value: 359, default value: 220. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_COLORSTYLE_HUE = 220 -# The HTML_COLORSTYLE_SAT tag controls the purity (or saturation) of -# the colors in the HTML output. For a value of 0 the output will use -# grayscales only. A value of 255 will produce the most vivid colors. +# The HTML_COLORSTYLE_SAT tag controls the purity (or saturation) of the colors +# in the HTML output. For a value of 0 the output will use grayscales only. A +# value of 255 will produce the most vivid colors. +# Minimum value: 0, maximum value: 255, default value: 100. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_COLORSTYLE_SAT = 100 -# The HTML_COLORSTYLE_GAMMA tag controls the gamma correction applied to -# the luminance component of the colors in the HTML output. Values below -# 100 gradually make the output lighter, whereas values above 100 make -# the output darker. The value divided by 100 is the actual gamma applied, -# so 80 represents a gamma of 0.8, The value 220 represents a gamma of 2.2, -# and 100 does not change the gamma. +# The HTML_COLORSTYLE_GAMMA tag controls the gamma correction applied to the +# luminance component of the colors in the HTML output. Values below 100 +# gradually make the output lighter, whereas values above 100 make the output +# darker. The value divided by 100 is the actual gamma applied, so 80 represents +# a gamma of 0.8, The value 220 represents a gamma of 2.2, and 100 does not +# change the gamma. +# Minimum value: 40, maximum value: 240, default value: 80. +# This tag requires that the tag GENERATE_HTML is set to YES. 
HTML_COLORSTYLE_GAMMA = 80 # If the HTML_TIMESTAMP tag is set to YES then the footer of each generated HTML -# page will contain the date and time when the page was generated. Setting -# this to NO can help when comparing the output of multiple runs. +# page will contain the date and time when the page was generated. Setting this +# to NO can help when comparing the output of multiple runs. +# The default value is: YES. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_TIMESTAMP = YES # If the HTML_DYNAMIC_SECTIONS tag is set to YES then the generated HTML # documentation will contain sections that can be hidden and shown after the # page has loaded. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_DYNAMIC_SECTIONS = NO -# With HTML_INDEX_NUM_ENTRIES one can control the preferred number of -# entries shown in the various tree structured indices initially; the user -# can expand and collapse entries dynamically later on. Doxygen will expand -# the tree to such a level that at most the specified number of entries are -# visible (unless a fully collapsed tree already exceeds this amount). -# So setting the number of entries 1 will produce a full collapsed tree by -# default. 0 is a special value representing an infinite number of entries -# and will result in a full expanded tree by default. +# With HTML_INDEX_NUM_ENTRIES one can control the preferred number of entries +# shown in the various tree structured indices initially; the user can expand +# and collapse entries dynamically later on. Doxygen will expand the tree to +# such a level that at most the specified number of entries are visible (unless +# a fully collapsed tree already exceeds this amount). So setting the number of +# entries 1 will produce a full collapsed tree by default. 0 is a special value +# representing an infinite number of entries and will result in a full expanded +# tree by default. +# Minimum value: 0, maximum value: 9999, default value: 100. +# This tag requires that the tag GENERATE_HTML is set to YES. HTML_INDEX_NUM_ENTRIES = 100 -# If the GENERATE_DOCSET tag is set to YES, additional index files -# will be generated that can be used as input for Apple's Xcode 3 -# integrated development environment, introduced with OSX 10.5 (Leopard). -# To create a documentation set, doxygen will generate a Makefile in the -# HTML output directory. Running make will produce the docset in that -# directory and running "make install" will install the docset in -# ~/Library/Developer/Shared/Documentation/DocSets so that Xcode will find -# it at startup. -# See http://developer.apple.com/tools/creatingdocsetswithdoxygen.html +# If the GENERATE_DOCSET tag is set to YES, additional index files will be +# generated that can be used as input for Apple's Xcode 3 integrated development +# environment (see: http://developer.apple.com/tools/xcode/), introduced with +# OSX 10.5 (Leopard). To create a documentation set, doxygen will generate a +# Makefile in the HTML output directory. Running make will produce the docset in +# that directory and running make install will install the docset in +# ~/Library/Developer/Shared/Documentation/DocSets so that Xcode will find it at +# startup. See http://developer.apple.com/tools/creatingdocsetswithdoxygen.html # for more information. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. GENERATE_DOCSET = NO -# When GENERATE_DOCSET tag is set to YES, this tag determines the name of the -# feed. 
A documentation feed provides an umbrella under which multiple -# documentation sets from a single provider (such as a company or product suite) -# can be grouped. +# This tag determines the name of the docset feed. A documentation feed provides +# an umbrella under which multiple documentation sets from a single provider +# (such as a company or product suite) can be grouped. +# The default value is: Doxygen generated docs. +# This tag requires that the tag GENERATE_DOCSET is set to YES. DOCSET_FEEDNAME = "Doxygen generated docs" -# When GENERATE_DOCSET tag is set to YES, this tag specifies a string that -# should uniquely identify the documentation set bundle. This should be a -# reverse domain-name style string, e.g. com.mycompany.MyDocSet. Doxygen -# will append .docset to the name. +# This tag specifies a string that should uniquely identify the documentation +# set bundle. This should be a reverse domain-name style string, e.g. +# com.mycompany.MyDocSet. Doxygen will append .docset to the name. +# The default value is: org.doxygen.Project. +# This tag requires that the tag GENERATE_DOCSET is set to YES. DOCSET_BUNDLE_ID = org.doxygen.Project -# When GENERATE_PUBLISHER_ID tag specifies a string that should uniquely -# identify the documentation publisher. This should be a reverse domain-name -# style string, e.g. com.mycompany.MyDocSet.documentation. +# The DOCSET_PUBLISHER_ID tag specifies a string that should uniquely identify +# the documentation publisher. This should be a reverse domain-name style +# string, e.g. com.mycompany.MyDocSet.documentation. +# The default value is: org.doxygen.Publisher. +# This tag requires that the tag GENERATE_DOCSET is set to YES. DOCSET_PUBLISHER_ID = org.doxygen.Publisher -# The GENERATE_PUBLISHER_NAME tag identifies the documentation publisher. +# The DOCSET_PUBLISHER_NAME tag identifies the documentation publisher. +# The default value is: Publisher. +# This tag requires that the tag GENERATE_DOCSET is set to YES. DOCSET_PUBLISHER_NAME = Publisher -# If the GENERATE_HTMLHELP tag is set to YES, additional index files -# will be generated that can be used as input for tools like the -# Microsoft HTML help workshop to generate a compiled HTML help file (.chm) -# of the generated HTML documentation. +# If the GENERATE_HTMLHELP tag is set to YES then doxygen generates three +# additional HTML index files: index.hhp, index.hhc, and index.hhk. The +# index.hhp is a project file that can be read by Microsoft's HTML Help Workshop +# (see: http://www.microsoft.com/en-us/download/details.aspx?id=21138) on +# Windows. +# +# The HTML Help Workshop contains a compiler that can convert all HTML output +# generated by doxygen into a single compiled HTML file (.chm). Compiled HTML +# files are now used as the Windows 98 help format, and will replace the old +# Windows help format (.hlp) on all Windows platforms in the future. Compressed +# HTML files also contain an index, a table of contents, and you can search for +# words in the documentation. The HTML workshop also contains a viewer for +# compressed HTML files. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. GENERATE_HTMLHELP = NO -# If the GENERATE_HTMLHELP tag is set to YES, the CHM_FILE tag can -# be used to specify the file name of the resulting .chm file. You -# can add a path in front of the file if the result should not be +# The CHM_FILE tag can be used to specify the file name of the resulting .chm +# file. 
You can add a path in front of the file if the result should not be # written to the html output directory. +# This tag requires that the tag GENERATE_HTMLHELP is set to YES. CHM_FILE = -# If the GENERATE_HTMLHELP tag is set to YES, the HHC_LOCATION tag can -# be used to specify the location (absolute path including file name) of -# the HTML help compiler (hhc.exe). If non-empty doxygen will try to run -# the HTML help compiler on the generated index.hhp. +# The HHC_LOCATION tag can be used to specify the location (absolute path +# including file name) of the HTML help compiler ( hhc.exe). If non-empty +# doxygen will try to run the HTML help compiler on the generated index.hhp. +# The file has to be specified with full path. +# This tag requires that the tag GENERATE_HTMLHELP is set to YES. HHC_LOCATION = -# If the GENERATE_HTMLHELP tag is set to YES, the GENERATE_CHI flag -# controls if a separate .chi index file is generated (YES) or that -# it should be included in the master .chm file (NO). +# The GENERATE_CHI flag controls if a separate .chi index file is generated ( +# YES) or that it should be included in the master .chm file ( NO). +# The default value is: NO. +# This tag requires that the tag GENERATE_HTMLHELP is set to YES. GENERATE_CHI = NO -# If the GENERATE_HTMLHELP tag is set to YES, the CHM_INDEX_ENCODING -# is used to encode HtmlHelp index (hhk), content (hhc) and project file -# content. +# The CHM_INDEX_ENCODING is used to encode HtmlHelp index ( hhk), content ( hhc) +# and project file content. +# This tag requires that the tag GENERATE_HTMLHELP is set to YES. CHM_INDEX_ENCODING = -# If the GENERATE_HTMLHELP tag is set to YES, the BINARY_TOC flag -# controls whether a binary table of contents is generated (YES) or a -# normal table of contents (NO) in the .chm file. +# The BINARY_TOC flag controls whether a binary table of contents is generated ( +# YES) or a normal table of contents ( NO) in the .chm file. Furthermore it +# enables the Previous and Next buttons. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTMLHELP is set to YES. BINARY_TOC = NO -# The TOC_EXPAND flag can be set to YES to add extra items for group members -# to the contents of the HTML help documentation and to the tree view. +# The TOC_EXPAND flag can be set to YES to add extra items for group members to +# the table of contents of the HTML help documentation and to the tree view. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTMLHELP is set to YES. TOC_EXPAND = NO # If the GENERATE_QHP tag is set to YES and both QHP_NAMESPACE and -# QHP_VIRTUAL_FOLDER are set, an additional index file will be generated -# that can be used as input for Qt's qhelpgenerator to generate a -# Qt Compressed Help (.qch) of the generated HTML documentation. +# QHP_VIRTUAL_FOLDER are set, an additional index file will be generated that +# can be used as input for Qt's qhelpgenerator to generate a Qt Compressed Help +# (.qch) of the generated HTML documentation. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. GENERATE_QHP = NO -# If the QHG_LOCATION tag is specified, the QCH_FILE tag can -# be used to specify the file name of the resulting .qch file. -# The path specified is relative to the HTML output folder. +# If the QHG_LOCATION tag is specified, the QCH_FILE tag can be used to specify +# the file name of the resulting .qch file. The path specified is relative to +# the HTML output folder. 
+# This tag requires that the tag GENERATE_QHP is set to YES. QCH_FILE = -# The QHP_NAMESPACE tag specifies the namespace to use when generating -# Qt Help Project output. For more information please see -# http://doc.trolltech.com/qthelpproject.html#namespace +# The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help +# Project output. For more information please see Qt Help Project / Namespace +# (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#namespace). +# The default value is: org.doxygen.Project. +# This tag requires that the tag GENERATE_QHP is set to YES. QHP_NAMESPACE = -# The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating -# Qt Help Project output. For more information please see -# http://doc.trolltech.com/qthelpproject.html#virtual-folders +# The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating Qt +# Help Project output. For more information please see Qt Help Project / Virtual +# Folders (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#virtual- +# folders). +# The default value is: doc. +# This tag requires that the tag GENERATE_QHP is set to YES. QHP_VIRTUAL_FOLDER = doc -# If QHP_CUST_FILTER_NAME is set, it specifies the name of a custom filter to -# add. For more information please see -# http://doc.trolltech.com/qthelpproject.html#custom-filters +# If the QHP_CUST_FILTER_NAME tag is set, it specifies the name of a custom +# filter to add. For more information please see Qt Help Project / Custom +# Filters (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#custom- +# filters). +# This tag requires that the tag GENERATE_QHP is set to YES. QHP_CUST_FILTER_NAME = -# The QHP_CUST_FILT_ATTRS tag specifies the list of the attributes of the -# custom filter to add. For more information please see -# <a href="http://doc.trolltech.com/qthelpproject.html#custom-filters"> -# Qt Help Project / Custom Filters</a>. +# The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the +# custom filter to add. For more information please see Qt Help Project / Custom +# Filters (see: http://qt-project.org/doc/qt-4.8/qthelpproject.html#custom- +# filters). +# This tag requires that the tag GENERATE_QHP is set to YES. QHP_CUST_FILTER_ATTRS = # The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this -# project's -# filter section matches. -# <a href="http://doc.trolltech.com/qthelpproject.html#filter-attributes"> -# Qt Help Project / Filter Attributes</a>. +# project's filter section matches. Qt Help Project / Filter Attributes (see: +# http://qt-project.org/doc/qt-4.8/qthelpproject.html#filter-attributes). +# This tag requires that the tag GENERATE_QHP is set to YES. QHP_SECT_FILTER_ATTRS = -# If the GENERATE_QHP tag is set to YES, the QHG_LOCATION tag can -# be used to specify the location of Qt's qhelpgenerator. -# If non-empty doxygen will try to run qhelpgenerator on the generated -# .qhp file. +# The QHG_LOCATION tag can be used to specify the location of Qt's +# qhelpgenerator. If non-empty doxygen will try to run qhelpgenerator on the +# generated .qhp file. +# This tag requires that the tag GENERATE_QHP is set to YES. QHG_LOCATION = -# If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files -# will be generated, which together with the HTML files, form an Eclipse help -# plugin. To install this plugin and make it available under the help contents -# menu in Eclipse, the contents of the directory containing the HTML and XML -# files needs to be copied into the plugins directory of eclipse. The name of -# the directory within the plugins directory should be the same as -# the ECLIPSE_DOC_ID value.
After copying Eclipse needs to be restarted before -# the help appears. +# If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files will be +# generated, together with the HTML files, they form an Eclipse help plugin. To +# install this plugin and make it available under the help contents menu in +# Eclipse, the contents of the directory containing the HTML and XML files needs +# to be copied into the plugins directory of eclipse. The name of the directory +# within the plugins directory should be the same as the ECLIPSE_DOC_ID value. +# After copying Eclipse needs to be restarted before the help appears. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. GENERATE_ECLIPSEHELP = NO -# A unique identifier for the eclipse help plugin. When installing the plugin -# the directory name containing the HTML and XML files should also have -# this name. +# A unique identifier for the Eclipse help plugin. When installing the plugin +# the directory name containing the HTML and XML files should also have this +# name. Each documentation set should have its own identifier. +# The default value is: org.doxygen.Project. +# This tag requires that the tag GENERATE_ECLIPSEHELP is set to YES. ECLIPSE_DOC_ID = org.doxygen.Project -# The DISABLE_INDEX tag can be used to turn on/off the condensed index (tabs) -# at top of each HTML page. The value NO (the default) enables the index and -# the value YES disables it. Since the tabs have the same information as the -# navigation tree you can set this option to NO if you already set -# GENERATE_TREEVIEW to YES. +# If you want full control over the layout of the generated HTML pages it might +# be necessary to disable the index and replace it with your own. The +# DISABLE_INDEX tag can be used to turn on/off the condensed index (tabs) at top +# of each HTML page. A value of NO enables the index and the value YES disables +# it. Since the tabs in the index contain the same information as the navigation +# tree, you can set this option to YES if you also set GENERATE_TREEVIEW to YES. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. DISABLE_INDEX = NO # The GENERATE_TREEVIEW tag is used to specify whether a tree-like index -# structure should be generated to display hierarchical information. -# If the tag value is set to YES, a side panel will be generated -# containing a tree-like index structure (just like the one that -# is generated for HTML Help). For this to work a browser that supports -# JavaScript, DHTML, CSS and frames is required (i.e. any modern browser). -# Windows users are probably better off using the HTML help feature. -# Since the tree basically has the same information as the tab index you -# could consider to set DISABLE_INDEX to NO when enabling this option. +# structure should be generated to display hierarchical information. If the tag +# value is set to YES, a side panel will be generated containing a tree-like +# index structure (just like the one that is generated for HTML Help). For this +# to work a browser that supports JavaScript, DHTML, CSS and frames is required +# (i.e. any modern browser). Windows users are probably better off using the +# HTML help feature. Via custom stylesheets (see HTML_EXTRA_STYLESHEET) one can +# further fine-tune the look of the index. As an example, the default style +# sheet generated by doxygen has an example that shows how to put an image at +# the root of the tree instead of the PROJECT_NAME. 
Since the tree basically has +# the same information as the tab index, you could consider setting +# DISABLE_INDEX to YES when enabling this option. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. GENERATE_TREEVIEW = NO -# The ENUM_VALUES_PER_LINE tag can be used to set the number of enum values -# (range [0,1..20]) that doxygen will group on one line in the generated HTML -# documentation. Note that a value of 0 will completely suppress the enum -# values from appearing in the overview section. +# The ENUM_VALUES_PER_LINE tag can be used to set the number of enum values that +# doxygen will group on one line in the generated HTML documentation. +# +# Note that a value of 0 will completely suppress the enum values from appearing +# in the overview section. +# Minimum value: 0, maximum value: 20, default value: 4. +# This tag requires that the tag GENERATE_HTML is set to YES. ENUM_VALUES_PER_LINE = 4 -# If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be -# used to set the initial width (in pixels) of the frame in which the tree -# is shown. +# If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be used +# to set the initial width (in pixels) of the frame in which the tree is shown. +# Minimum value: 0, maximum value: 1500, default value: 250. +# This tag requires that the tag GENERATE_HTML is set to YES. TREEVIEW_WIDTH = 250 -# When the EXT_LINKS_IN_WINDOW option is set to YES doxygen will open -# links to external symbols imported via tag files in a separate window. +# When the EXT_LINKS_IN_WINDOW option is set to YES doxygen will open links to +# external symbols imported via tag files in a separate window. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. EXT_LINKS_IN_WINDOW = NO -# Use this tag to change the font size of Latex formulas included -# as images in the HTML documentation. The default is 10. Note that -# when you change the font size after a successful doxygen run you need -# to manually remove any form_*.png images from the HTML output directory -# to force them to be regenerated. +# Use this tag to change the font size of LaTeX formulas included as images in +# the HTML documentation. When you change the font size after a successful +# doxygen run you need to manually remove any form_*.png images from the HTML +# output directory to force them to be regenerated. +# Minimum value: 8, maximum value: 50, default value: 10. +# This tag requires that the tag GENERATE_HTML is set to YES. FORMULA_FONTSIZE = 10 # Use the FORMULA_TRANPARENT tag to determine whether or not the images -# generated for formulas are transparent PNGs. Transparent PNGs are -# not supported properly for IE 6.0, but are supported on all modern browsers. -# Note that when changing this option you need to delete any form_*.png files -# in the HTML output before the changes have effect. +# generated for formulas are transparent PNGs. Transparent PNGs are not +# supported properly for IE 6.0, but are supported on all modern browsers. +# +# Note that when changing this option you need to delete any form_*.png files in +# the HTML output directory before the changes have effect. +# The default value is: YES. +# This tag requires that the tag GENERATE_HTML is set to YES. FORMULA_TRANSPARENT = YES -# Enable the USE_MATHJAX option to render LaTeX formulas using MathJax -# (see http://www.mathjax.org) which uses client side Javascript for the -# rendering instead of using prerendered bitmaps. 
Use this if you do not -# have LaTeX installed or if you want to formulas look prettier in the HTML -# output. When enabled you may also need to install MathJax separately and -# configure the path to it using the MATHJAX_RELPATH option. +# Enable the USE_MATHJAX option to render LaTeX formulas using MathJax (see +# http://www.mathjax.org) which uses client side Javascript for the rendering +# instead of using prerendered bitmaps. Use this if you do not have LaTeX +# installed or if you want to formulas look prettier in the HTML output. When +# enabled you may also need to install MathJax separately and configure the path +# to it using the MATHJAX_RELPATH option. +# The default value is: NO. +# This tag requires that the tag GENERATE_HTML is set to YES. USE_MATHJAX = NO # When MathJax is enabled you can set the default output format to be used for -# the MathJax output. Supported types are HTML-CSS, NativeMML (i.e. MathML) and -# SVG. The default value is HTML-CSS, which is slower, but has the best -# compatibility. +# the MathJax output. See the MathJax site (see: +# http://docs.mathjax.org/en/latest/output.html) for more details. +# Possible values are: HTML-CSS (which is slower, but has the best +# compatibility), NativeMML (i.e. MathML) and SVG. +# The default value is: HTML-CSS. +# This tag requires that the tag USE_MATHJAX is set to YES. MATHJAX_FORMAT = HTML-CSS -# When MathJax is enabled you need to specify the location relative to the -# HTML output directory using the MATHJAX_RELPATH option. The destination -# directory should contain the MathJax.js script. For instance, if the mathjax -# directory is located at the same level as the HTML output directory, then -# MATHJAX_RELPATH should be ../mathjax. The default value points to -# the MathJax Content Delivery Network so you can quickly see the result without -# installing MathJax. -# However, it is strongly recommended to install a local -# copy of MathJax from http://www.mathjax.org before deployment. +# When MathJax is enabled you need to specify the location relative to the HTML +# output directory using the MATHJAX_RELPATH option. The destination directory +# should contain the MathJax.js script. For instance, if the mathjax directory +# is located at the same level as the HTML output directory, then +# MATHJAX_RELPATH should be ../mathjax. The default value points to the MathJax +# Content Delivery Network so you can quickly see the result without installing +# MathJax. However, it is strongly recommended to install a local copy of +# MathJax from http://www.mathjax.org before deployment. +# The default value is: http://cdn.mathjax.org/mathjax/latest. +# This tag requires that the tag USE_MATHJAX is set to YES. MATHJAX_RELPATH = http://cdn.mathjax.org/mathjax/latest -# The MATHJAX_EXTENSIONS tag can be used to specify one or MathJax extension -# names that should be enabled during MathJax rendering. +# The MATHJAX_EXTENSIONS tag can be used to specify one or more MathJax +# extension names that should be enabled during MathJax rendering. For example +# MATHJAX_EXTENSIONS = TeX/AMSmath TeX/AMSsymbols +# This tag requires that the tag USE_MATHJAX is set to YES. MATHJAX_EXTENSIONS = -# The MATHJAX_CODEFILE tag can be used to specify a file with javascript -# pieces of code that will be used on startup of the MathJax code. +# The MATHJAX_CODEFILE tag can be used to specify a file with javascript pieces +# of code that will be used on startup of the MathJax code. 
See the MathJax site +# (see: http://docs.mathjax.org/en/latest/output.html) for more details. For an +# example see the documentation. +# This tag requires that the tag USE_MATHJAX is set to YES. MATHJAX_CODEFILE = -# When the SEARCHENGINE tag is enabled doxygen will generate a search box -# for the HTML output. The underlying search engine uses javascript -# and DHTML and should work on any modern browser. Note that when using -# HTML help (GENERATE_HTMLHELP), Qt help (GENERATE_QHP), or docsets -# (GENERATE_DOCSET) there is already a search function so this one should -# typically be disabled. For large projects the javascript based search engine -# can be slow, then enabling SERVER_BASED_SEARCH may provide a better solution. +# When the SEARCHENGINE tag is enabled doxygen will generate a search box for +# the HTML output. The underlying search engine uses javascript and DHTML and +# should work on any modern browser. Note that when using HTML help +# (GENERATE_HTMLHELP), Qt help (GENERATE_QHP), or docsets (GENERATE_DOCSET) +# there is already a search function so this one should typically be disabled. +# For large projects the javascript based search engine can be slow, then +# enabling SERVER_BASED_SEARCH may provide a better solution. It is possible to +# search using the keyboard; to jump to the search box use <access key> + S +# (what the <access key> is depends on the OS and browser, but it is typically +# <CTRL>, <ALT>/