server.cc: In member function ‘bool ServerState::HeaderLine(std::string)’:
server.cc:198:72: warning: format ‘%llu’ expects argument of type ‘long long unsigned int*’, but argument 3 has type ‘long long int*’ [-Wformat=]
else if (sscanf(Val.c_str(),"bytes %llu-%*u/%llu",&StartPos,&Size) != 2)
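A hedged sketch of the kind of fix such a warning calls for, not the
actual server.cc change: the warning means the first %llu is paired
with a signed long long (StartPos here), so either the specifier or the
variable type has to change. Val stands in for the Content-Range value.
    #include <stdio.h>
    #include <string>
    static bool ParseContentRange(std::string const &Val,
                                  long long &StartPos, unsigned long long &Size)
    {
       // option a: match the specifier to the signed variable ...
       return sscanf(Val.c_str(), "bytes %lld-%*u/%llu", &StartPos, &Size) == 2;
       // option b: ... or declare StartPos as unsigned long long and keep %llu.
    }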
support DEB_BUILD_PROFILES and -P for build profiles
Inspired by the rest of the patch in 661537, but the parsing of the
various ways of setting the build profiles is abstracted a bit more so
it can potentially be reused and all apt parts share the same
behaviour. In particular, config options, cmdline options and the
environment are not combined as proposed, as this isn't APT's usual
behaviour and dpkg doesn't do it either; one simply overrides the
other, as it normally does.
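A hedged sketch of the "one overrides the other" idea, not APT's actual
code; the helper shape and the precedence order shown are illustrative
assumptions only.
    #include <cstdlib>
    #include <string>
    #include <vector>
    static std::vector<std::string> BuildProfiles(
          std::vector<std::string> const &FromCmdLine,   // e.g. set via -P
          std::vector<std::string> const &FromConfig)
    {
       if (FromCmdLine.empty() == false)
          return FromCmdLine;                            // cmdline wins outright
       if (char const * const env = std::getenv("DEB_BUILD_PROFILES"))
          return { std::string(env) };                   // space-separated list,
                                                         // splitting omitted here
       return FromConfig;                                // otherwise the config
    }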
Johannes Schauer [Mon, 24 Feb 2014 23:12:20 +0000 (00:12 +0100)]
implement BuildProfileSpec support as dpkg has in 1.17.2
Build-dependencies are now able to include a <profile.foo …>
specification limiting their usage, similar to the already supported
[arch …].
More details: https://wiki.debian.org/BuildProfileSpec
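As an illustration of the syntax (package and profile names invented;
the exact form, including negation, is specified on the wiki page
above), a profile-restricted build-dependency next to the familiar
architecture restriction:
    Build-Depends: debhelper (>= 9),
                   foo-doc-generator <!profile.nodoc>,
                   libbar-dev [amd64]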
add default and override handling for Cnf::FindVector
Automatically handle the override of list options via their parent
value, which can even be a comma-separated list of values. It also adds
an easy way of providing a default for the list.
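A hedged sketch of how such a lookup might be used, not verbatim APT
code: the option name, the default value and the exact shape of the
default parameter are assumptions here.
    #include <apt-pkg/configuration.h>
    #include <string>
    #include <vector>
    static std::vector<std::string> ReadListOption()
    {
       // a list option that can be overridden via its parent value, with a
       // comma-separated default if neither is set (names are placeholders)
       return _config->FindVector("My::List-Option", "default-a,default-b");
    }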
It can be useful to have a whole makefile available for vendor setup,
but by providing a basic one we can deal with the simple cases more
easily (and changes to the system are presumably easier).
Prevents "old" dependencies from having an influence on the scoring.
With positive dependencies this is usually not a problem, but negative
dependencies can linger around for a long time.
propagate a negative score point along breaks/conflicts
versioned -dev packages like db and boost have the problem that no
dependencies give them a competitive advantage over an older
incarnation of the -dev package, so they tend to be kept back until the
old version is removed from the archive, which, if the user has older
releases in their sources, can take a long time (or never happens).
The newer version has a Conflicts/Breaks against the older one, but the
older one has none against the newer, so by giving the older one a
reduced score via the conflicts the newer one can win if there is no
other reason to keep it. If both have a conflict against each other the
penalties cancel each other out, so no harm is done.
This gives "action" a slightly bigger edge in breaks/conflicts cases
than before, but holding back isn't a really good solution anyway.
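A hedged sketch of the propagation described above, not the resolver's
verbatim code; the function shape and the penalty value are
assumptions.
    #include <apt-pkg/depcache.h>
    #include <apt-pkg/pkgcache.h>
    static void PropagateNegativeScores(pkgDepCache &Cache,
                                        pkgCache::PkgIterator const &I,
                                        int *Scores)
    {
       // give the target of every Conflicts/Breaks of I's candidate a small
       // negative score so a dependency-less old -dev package loses its edge
       pkgCache::VerIterator const Cand = Cache[I].CandidateVerIter(Cache);
       if (Cand.end() == true)
          return;
       for (pkgCache::DepIterator D = Cand.DependsList(); D.end() == false; ++D)
          if (D->Type == pkgCache::Dep::Conflicts ||
              D->Type == pkgCache::Dep::DpkgBreaks)
             Scores[D.TargetPkg()->ID] -= 1;   // penalty value assumed
    }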
Guillem Jover [Sun, 16 Feb 2014 22:30:48 +0000 (23:30 +0100)]
Add support for data.tar, control.tar and control.tar.xz
Sync the deb(5) format support with latest dpkg, by allowing
uncompressed tar members and xz compressed control.tar. This
also refactors the control.tar member extraction by using
ExtractTarMember(), which also means future changes only need
to be implemented in a single place.
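A hedged sketch of how the shared extraction path might be used;
ExtractTarMember() is named above, but the exact signature shown here
is an assumption.
    #include <apt-pkg/debfile.h>
    #include <apt-pkg/dirstream.h>
    #include <apt-pkg/fileutl.h>
    static bool ExtractBoth(FileFd &File, pkgDirStream &Control, pkgDirStream &Data)
    {
       // both members go through one helper, which can transparently find the
       // member whether it is uncompressed or gzip/xz compressed
       debDebFile Deb(File);
       return Deb.ExtractTarMember(Control, "control.tar") &&   // signature assumed
              Deb.ExtractTarMember(Data, "data.tar");
    }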
Michael Vogt [Fri, 14 Feb 2014 18:58:56 +0000 (19:58 +0100)]
disable fnmatch()
The current PackageContainerInterface::FromString() will do a
FromFnmatch() first and then FromRegEx(). This commit reverts
that change and restores the old behaviour of only looking for a RegEx
and not a glob-style pattern. The rationale is that:
a) currently a fnmatch() is misleadingly reported as a regex match to
the user (Bug#738880)
b) a fnmatch may match something different than a RegEx would, so the
change broke a published interface (a toy illustration follows below)
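To illustrate point b) with a toy example (not APT code): the same user
input selects different names depending on whether it is treated as a
glob or as an unanchored regex.
    #include <fnmatch.h>
    #include <regex.h>
    #include <stdio.h>
    int main()
    {
       // as a glob, "apt*" only matches whole names starting with "apt" ...
       printf("glob : %s\n", fnmatch("apt*", "synaptic", 0) == 0 ? "match" : "no match");
       // ... as an unanchored POSIX regex, "apt*" means "ap" followed by any
       // number of "t" and therefore also matches a substring of "synaptic"
       regex_t re;
       regcomp(&re, "apt*", REG_EXTENDED | REG_NOSUB);
       printf("regex: %s\n", regexec(&re, "synaptic", 0, NULL, 0) == 0 ? "match" : "no match");
       regfree(&re);
       return 0;
    }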
Commit 6008b79adf1d7ea5607fab87a355d664c8725026 should have been guarded
by "Git-Dch: Ignore", but it wasn't and I only noticed it with the Close
message via deity thinking "hehe, I wonder if someone is gonna notice".
Looks like someone did: hats off to reddit user itisOmegakai!
Good to know that what I do isn't only monitored by governments. :)
As there is another instance of basically the same code, we just factor
the code out a bit and reuse it, so it's even cleaner and not only
simpler.
do not compress .xhtml files and remove junk files
dh_compress compresses .xhtml files by default, which breaks our
doxygen documentation. doxygen also creates a bunch of temporary files
which stay in the build directory, so we remove them before installing
the directory as documentation.
Switching protocols at random is a bad idea if e.g. http could switch
to file, so we limit the possibilities to http to http and http to
https. As very few people (less than 1% according to popcon) have the
https method installed, this likely changes nothing in terms of
failures. The commit does add a friendly hint about which package needs
to be installed, though.
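A hedged sketch of the rule stated above; the function name and shape
are invented for illustration and this is not the acquire method's
actual code.
    #include <string>
    static bool RedirectAllowed(std::string const &FromScheme,
                                std::string const &ToScheme)
    {
       // a redirect may keep http or upgrade it to https, nothing else
       return FromScheme == "http" &&
              (ToScheme == "http" || ToScheme == "https");
    }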
The current description says:
"Many users find dselect intimidating and new users may prefer to use
apt-based user interfaces."
It doesn't feel right to refer users to it then.
Michael Vogt [Wed, 12 Feb 2014 06:59:07 +0000 (07:59 +0100)]
Use an APT::VersionSet instead of a VersionList
Use an APT::VersionSet instead of an APT::VersionList in DoDownload()
to ensure that there is only one version in the set even if the
user passes multiple identical name/versions on the commandline
(Bug#738103)
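A toy illustration of the data-structure property relied on here, not
DoDownload() itself (the version string is an example): a set collapses
the duplicate the user typed, a list keeps both.
    #include <iostream>
    #include <set>
    #include <string>
    #include <vector>
    int main()
    {
       // the same "name=version" given twice on the commandline
       std::vector<std::string> list = {"apt=0.9.15", "apt=0.9.15"};
       std::set<std::string> set(list.begin(), list.end());
       std::cout << list.size() << " vs " << set.size() << std::endl;   // 2 vs 1
       return 0;
    }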
John Ogness [Fri, 13 Dec 2013 19:59:31 +0000 (20:59 +0100)]
apt-cdrom should succeed if any drive succeeds
If there are multiple CD-ROM drives, `apt-cdrom add` will abort with an
error if any of the drives does not contain a Debian CD, which
contradicts our documentation saying "a CD-ROM", and scripts do not
expect this behaviour either.
This patch modifies apt-cdrom to return success if any of the drives
succeeded. If failures occur, apt-cdrom will still continue trying all
the drives and report the last failure (if none of them succeeded).
The 'ident' command was also changed to match the new 'add' behavior.
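A hedged sketch of the aggregation, not apt-cdrom's verbatim code;
DoAddDrive() is a stand-in for the real per-drive work and is assumed.
    #include <string>
    #include <vector>
    static bool DoAddDrive(std::string const &Drive);   // assumed helper
    static bool AddAllDrives(std::vector<std::string> const &Drives)
    {
       bool Success = false;
       for (std::string const &Drive : Drives)
          if (DoAddDrive(Drive) == true)
             Success = true;       // one working drive is enough for success
          // a failure is remembered but reported only if none succeeded
       return Success;
    }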
cppcheck complains about the obsolete utime as it was removed in
POSIX.1-2008 and recommends usage of utimensat/futimens instead, as
those are in POSIX, so commit 9ce3cfc9 switched to them.
It is just that they aren't as portable as the standard suggests:
at least our kFreeBSD and Hurd ports stumble over them at runtime.
So, to make both the ports and cppcheck happy, we use utimes instead.
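A minimal example of the portable call chosen; path and timestamp are
placeholders.
    #include <sys/time.h>
    #include <time.h>
    static int SetFileTime(char const * const Path, time_t const When)
    {
       // set atime and mtime with utimes(), which is available on the
       // kFreeBSD and Hurd ports as well
       struct timeval times[2];
       times[0].tv_sec = times[1].tv_sec = When;
       times[0].tv_usec = times[1].tv_usec = 0;
       return utimes(Path, times);
    }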
With APT::Get::List-Cleanup disabled the ed-style patch files would
otherwise linger in the lists/ directory. That was kinda okay in the
old non-client-merge mode as the filename was always the same, so the
file was constantly overridden, but now with different names for
client-merge quite a few could pile up on the system and be used by the
next call, as it picks them up based on the filename.
Very few of our methods are documented, so this is A LOT of noise
hiding the "interesting" warnings about methods which are documented,
but incorrectly, and similar issues.
Does the same as before, but the logic is a bit simpler for humans as
well as compilers. scan-build complained about it at least with:
"Result of operation is garbage or undefined"
Colin Watson [Thu, 30 Jan 2014 14:08:08 +0000 (14:08 +0000)]
multicompress with externals sets wrong file modes
Copied from the bug description:
After we upgraded the Ubuntu master archive from lucid to precise, we
noticed that Translation-en.bz2 was being written with mode 0600 rather
than 0644, which broke our mirroring. This is no longer reproducible as
such in unstable because apt now links against libbz2, but it's still
reproducible with xz; it happens because multicompress fchmods one end
of the compression pipe in this case rather than the target file.
[Original testcase slightly modified to comply with house-style]
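A hedged sketch of the bug class described in the report, not
multicompress's code; the descriptor names are for illustration only.
    #include <sys/stat.h>
    static void FixTargetMode(int const CompressorPipeFd, int const TargetFd)
    {
       // with an external compressor the mode has to be set on the final
       // target file, not on the write end of the pipe feeding the compressor
       (void) CompressorPipeFd;     // calling fchmod() on this one is the bug
       fchmod(TargetFd, 0644);      // the compressed target becomes 0644
    }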
If a (Pre-)Depends can't be satisfied, there is no point in keeping the
candidate as it is, as it is impossible to find a solution for it, so
we can just as well reset the candidate to the currently installed
version. This way we avoid trying to install this impossible candidate
later on.
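A hedged sketch of the fallback, not the actual code: the surrounding
satisfiability check is omitted and the call placement is an
assumption.
    #include <apt-pkg/depcache.h>
    static void ResetImpossibleCandidate(pkgDepCache &Cache,
                                         pkgCache::PkgIterator const &Pkg)
    {
       // fall back to the installed version so the resolver does not keep
       // chasing a candidate that can never be satisfied
       if (Pkg->CurrentVer != 0)
          Cache.SetCandidateVersion(Pkg.CurrentVer());
    }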
The offset variable in DebSrcRecordParser was not initialized; we now
do so and, based on it, do not trigger a restart if the parser was not
used yet, avoiding a needless rescan of the section.
Detected while working on the previous commit e62aa1dd. Both commits act
as a "fix" for the bug shown in the testcase of that commit – this one
here would only hide it, though.
pkgTagFile: if we have seen the end, do not try to see more
Asking for more via Step() will notice that we are already done with
the file and will result in a failure, which means we can't find the
last sections anymore (which is especially painful if we haven't moved
at all: in the testcase we then haven't even looked at one of the
sources, leading to strange behaviour).
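A hedged sketch of the guard, not pkgTagFile's verbatim code; the
member name is assumed.
    class TagFileSketch
    {
       bool SeenEnd = false;
    public:
       bool Step()
       {
          if (SeenEnd == true)
             return false;       // we have seen the end, don't try to see more
          // ... real parsing would advance here and set SeenEnd at EOF ...
          return true;
       }
    };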