Search Results

Search found 2952 results on 119 pages for 'dependencies'.

Page 12/119 | < Previous Page | 8 9 10 11 12 13 14 15 16 17 18 19  | Next Page >

  • Not able to install g++ and gcc on Debian

    - by austin powers
    Hi, I want to use DirectAdmin as my web control panel and it needs several packages like g++, gcc, etc. As usual, I started by typing apt-get install g++ and that's where the problems start: a dependency error. Then I tried apt-get -f install and got this error: (Reading database ... 15140 files and directories currently installed.) Removing libc6-xen ... ldconfig: /etc/ld.so.conf.d/libc6-xen.conf:6: hwcap index 0 already defined as nosegneg dpkg: error processing libc6-xen (--remove): subprocess post-removal script returned error exit status 1 Errors were encountered while processing: libc6-xen E: Sub-process /usr/bin/dpkg returned an error code (1) What should I do? I want to install g++ and all of its dependencies; DirectAdmin requires it. Regards.

    Read the article

  • Installing cURL on Ubuntu

    - by davykiash
    I'm trying to install cURL on my Ubuntu server using the command sudo apt-get install php5-curl. However, I get the following error: Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: php5-curl: Depends: phpapi-20060613+lfs Depends: php5-common (= 5.2.6.dfsg.1-3ubuntu4.5) but 5.3.2-0.dotdeb.1 is to be installed E: Broken packages I am running PHP Version 5.3.2-0.dotdeb.1 on my server. What's the issue? I need to get curl up and running.

    Read the article

  • Error in JBoss 5 - DEPLOYMENTS MISSING DEPENDENCIES

    - by Nila
    Hi! I'm new to EJB3. I tried deploying a simple WAR file, but I'm getting the following error. Please help me. DEPLOYMENTS MISSING DEPENDENCIES: Deployment "jboss.j2ee:jar=EJB3-war.war,name=Statelessbean,service=EJB3_endpoint" is missing the following dependencies: Dependency "jboss.j2ee:jar=EJB3-war.war,name=Statelessbean,service=EJB3" (should be in state "Configured", but is actually in state "** NOT FOUND Depends on 'jboss.j2ee:jar=EJB3-war.war,name=Statelessbean,service=EJB3' **") DEPLOYMENTS IN ERROR: Deployment "jboss.j2ee:jar=EJB3-war.war,name=Statelessbean,service=EJB3" is in error due to the following reason(s): ** NOT FOUND Depends on 'jboss.j2ee:jar=EJB3-war.war,name=Statelessbean,service=EJB3' ** Deployment "vfszip:/D:/Softwares/jboss-5.1.0.GA/server/default/deploy/EJB3.ear/" is in error due to the following reason(s): java.lang.IllegalStateException: jboss.web.deployment:war=/EJB3-war is already installed. Thanks in advance.

    Read the article

  • managing library dependencies with Boost.Build and C++

    - by user931794
    I want to develop a project which can be built on a bunch of different platforms. The project code will be in C++; what's the best way to manage libraries? I plan on using bjam as the build system because I'm going to be depending on Boost and its unit testing framework as well. The two dependent libraries are Boost itself and FLTK. The possibilities that come to mind for library management are:

    1. include build artifacts (binaries) and headers for all supported platforms in-tree
    2. include complete source for all dependent libraries in-tree, and somehow script them as dependencies
    3. a combination of 1 and 2, like node.js does with v8
    4. inform the user that they need to build the libraries themselves and then have them on the PATH or in some special directory, like libcurl does with its dependencies

    What is the best approach here? The project will probably not grow beyond a few thousand lines over the next six months, but I want to make the right choice here so that I don't have to come back and switch everything around later.

    Read the article

  • hackage package dependencies and future-proof libraries

    - by yairchu
    In the dependencies section of a cabal file: Build-Depends: base >= 3 && < 5, transformers >= 0.2.0 Should I be doing something like Build-Depends: base >= 3 && < 5, transformers >= 0.2.0 && < 0.3.0 (putting upper limits on the versions of the packages I depend on) or not? I'll use a real example: my "List" package on Hackage (List monad transformer and class). If I don't put the limit, my package could be broken by a change in "transformers". If I do put the limit, a user who is on a newer version of "transformers" will not be able to use lift and liftIO with ListT, because ListT is only an instance of those classes for transformers-0.2.x. I guess that applications should always put upper limits so that they never break, so this question is only about libraries: should I use the upper version limit on dependencies or not?

    Read the article

  • Dealing with dependencies between WCF services when using Castle Windsor

    - by Georgia Brown
    I have several WCF services which use Castle Windsor to resolve their dependencies. Now I need some of these services to talk to each other. The typical structure is service -- Business Logic -- DAL. The calls to the other services need to occur at the Business Logic level. What is the best approach for implementing this? Should I simply inject a service proxy into the business logic? Is this wasteful if, for example, only one of two methods in my service needs to use this proxy? What if the services need to talk to each other - will Castle Windsor get stuck in a loop trying to resolve each service's dependencies?
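    One common approach is sketched below (a minimal illustration, not the asker's code: IOrderService, OrderLogic and the "OrderServiceEndpoint" configuration name are hypothetical, and the registration syntax assumes Windsor 3 or later). The business logic takes the other service's contract as a constructor dependency, and Windsor builds the proxy through a factory method:

        using System.ServiceModel;
        using Castle.MicroKernel.Registration;
        using Castle.Windsor;

        // Hypothetical contract exposed by the other WCF service.
        [ServiceContract]
        public interface IOrderService
        {
            [OperationContract]
            string GetOrderStatus(int orderId);
        }

        // Business logic depends only on the contract, not on a concrete proxy type.
        public class OrderLogic
        {
            private readonly IOrderService orders;

            public OrderLogic(IOrderService orders)
            {
                this.orders = orders;
            }

            public string Describe(int orderId)
            {
                return "Order status: " + orders.GetOrderStatus(orderId);
            }
        }

        public static class Bootstrapper
        {
            public static IWindsorContainer Configure()
            {
                var container = new WindsorContainer();

                // "OrderServiceEndpoint" is an assumed client endpoint name taken
                // from the consuming service's configuration file.
                container.Register(
                    Component.For<IOrderService>()
                             .UsingFactoryMethod(() =>
                                 new ChannelFactory<IOrderService>("OrderServiceEndpoint").CreateChannel())
                             .LifestyleTransient(),
                    Component.For<OrderLogic>().LifestyleTransient());

                return container;
            }
        }

    Because the proxy is created by a factory method rather than by resolving the other service's own components, each container only sees its own registrations, so there is no cycle for Windsor to get stuck in; if creating an unused proxy bothers you, injecting a factory delegate instead of the proxy itself defers channel creation until a method actually needs it.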

    Read the article

  • Assembly Load and loading the "sub-modules" dependencies - "cannot find the file specified"

    - by Ted
    There are several questions out there that ask the same thing, but I cannot understand the answers they received, so here goes. Similar questions: http://stackoverflow.com/questions/1874277/dynamically-load-assembly-and-manually-force-path-to-get-referenced-assemblies ; http://stackoverflow.com/questions/22012/loading-assemblies-and-its-dependencies-closed

    The question in short: I need to figure out how dependencies, i.e. references in my modules, can be loaded dynamically. Right now I am getting "The system cannot find the file specified" on assemblies referenced in my so-called modules. I cannot really work out how to use the AssemblyResolve event.

    The longer version: I have one application, MODULECONTROLLER, that loads separate modules. These "separate modules" are located in well-known subdirectories, like appBinDir\Modules\Module1 and appBinDir\Modules\Module2. Each directory contains all the DLLs that exist in the bin directory of those projects after a build. The MODULECONTROLLER loads all the DLLs contained in those folders using this code:

        byte[] bytes = File.ReadAllBytes(dllFileFullPath);
        Assembly assembly = null;
        assembly = Assembly.Load(bytes);

    As you can see, I am loading the byte[] array (so I don't lock the DLL files). Now, in for example MODULE1, I have a static reference called MyGreatXmlProtocol. MyGreatXmlProtocol.dll also exists in the directory appBinDir\Modules\Module1 and is loaded using the code above. When code in MODULE1 tries to use MyGreatXmlProtocol, I get: Could not load file or assembly 'MyGreatXmlProtocol, Version=1.0.3797.26527, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified. A post (like this one) says: "To my understanding reflection will load the main assembly and then search the GAC for the referenced assemblies, if it cannot find it there, you can then incorporate an assemblyResolve event."

    First: is it really necessary to use the AssemblyResolve event to make this work? Shouldn't my different MODULEs themselves load their DLLs, as they are statically referenced? Second: if AssemblyResolve is the way to go, how do I use it? I have attached a handler to the event but I never get anything for MyGreatXmlProtocol.

    === EDIT === Code for the AssemblyResolve event handler:

        public GUI()
        {
            InitializeComponent();
            AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);
            ...
        }

        Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
        {
            Console.WriteLine(args.Name);
            return null;
        }

    Hope I wasn't too fuzzy. Thanks!
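    For the second question, here is a minimal sketch of a working AssemblyResolve handler (the Modules folder layout and the byte-array loading come from the question; the probing logic and the ModuleAssemblyResolver name are illustrative assumptions, not the asker's code). Because Assembly.Load(byte[]) gives the loaded assembly no location, the runtime cannot probe the module folders for its static references, so the handler searches those folders and loads the missing DLL the same way:

        using System;
        using System.IO;
        using System.Reflection;

        static class ModuleAssemblyResolver
        {
            // Call once at startup, before any module code runs.
            public static void Install()
            {
                AppDomain.CurrentDomain.AssemblyResolve += OnAssemblyResolve;
            }

            static Assembly OnAssemblyResolve(object sender, ResolveEventArgs args)
            {
                // args.Name is the full display name; only the simple name maps to a file name.
                string simpleName = new AssemblyName(args.Name).Name;
                string modulesRoot = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Modules");

                if (!Directory.Exists(modulesRoot))
                    return null;

                // Look for <SimpleName>.dll in every module subdirectory.
                foreach (string moduleDir in Directory.GetDirectories(modulesRoot))
                {
                    string candidate = Path.Combine(moduleDir, simpleName + ".dll");
                    if (File.Exists(candidate))
                        return Assembly.Load(File.ReadAllBytes(candidate)); // byte-array load, as in the question, so the file is not locked
                }

                // Returning null lets the original load failure surface.
                return null;
            }
        }

    As for the first question: assemblies loaded from a byte array have no codebase, so their static references are resolved only through the normal probing paths and the GAC; the module subdirectories are never searched unless a handler like the one above (or up-front loading of every dependency) takes care of it.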

    Read the article

  • How to prevent Maven from resolving dependencies in the local repository

    - by Nils Schmidt
    Is there a way to tell Maven (when doing mvn package, mvn site, etc.) not to resolve dependencies from the local repository? Background of this question: sometimes I run into problems when previously cached dependencies (e.g. SomeProject-0.7-ALPHA) are no longer available in the remote repository. In my local build everything still works fine, as the dependency has been cached before. But as soon as I share my POM with others, they may get into trouble, as they don't have a cached version of that dependency and it can no longer be resolved from the remote repository. Any help will be appreciated. Thanks in advance!

    Read the article

  • Handling hundreds of dependencies with ant

    - by Roberto
    Hi guys, I have to refactor an Ant XML file. Basically I have one big task that checks out (using CVS) a lot of dependencies, builds them, and then copies all the JARs/WSDLs generated by building them to a directory that I specify. If one dependency version changes, I have to change the name in at least three places in the XML file (CVS checkout, build, copy). What I'd like is a single place where I can specify each dependency's name, without having to search and replace the dependency name throughout the file. One of the problems is that the CVS project could be /path1/path2/project with tag=v12, but the JARs generated by that single project's build could be several with different names, so it seems to be a bit complicated. Do you have any idea how I can get this done?

    Read the article

  • Eclipse, Android ndk, source files, and library project dependencies

    - by Android Noob
    In Microsoft Visual Studio 2010, it is possible to create a Solution with multiple projects and set dependencies between projects. I'm trying to figure out if the same thing can be done using Eclipse via the NDK. More specifically, I want to know if it is possible to create C source files in an ordinary Android project that can reference C header files in an Android library project. For example: Android library project: Sockets; ordinary Android project: Socket_Server. Sockets contains all the C header/source files that are needed to do socket I/O. Socket_Server contains test code that makes calls to the functions that are defined in the Sockets library project. This test code requires a header file that contains the function declarations of all API calls. I already set the library dependencies between the projects via: Properties > Android > Library > Add

    Read the article

  • Python in AWS Elastic Beanstalk: Private package dependencies

    - by Adam Matan
    I would like to deploy a Python Flask application on Beanstalk. The application depends on external packages (e.g. geopy) and internal packages (e.g. adam_geography). The manual says: "Create a requirements.txt file and place it in the top-level directory of your source bundle." This would probably fetch geopy and its dependencies, but it would not fetch adam_geography, which is available from a custom repo inside my VPC. How do I specify/upload private, internal Python package dependencies in a Beanstalk application?

    Read the article

  • Java EE Module Dependencies in web project?

    - by Sled
    Hey guys, I have this web project to which I have to add a JAR from another EJB project. Normally I'd right-click the web project and go to Properties - Java EE Module Dependencies. I don't know if it is because I upgraded Eclipse, or I'm doing something wrong, but I just can't find "Java EE Module Dependencies" in the Properties window. Both projects are linked with the same EAR, so the EJB JAR file should be there! Any ideas what I'm doing wrong, or some other way I could attach the EJB's JAR file? I'm only allowed to work with Eclipse, so NetBeans is not an option. Thanks! EDIT: basically, this is what I want to do, but that specific panel won't show up...

    Read the article

  • DEB: "Provides:" field ignored

    - by Creshal
    I need to replace a package with a custom one, which gets its own name (foo-origpackage). To allow it to be used as a drop-in replacement, I added the Provides: origpackage line to the control file. apt-cache show foo-origpackage lists the "Provides" entry just fine. However, when I want to install a package depending on origpackage, it fails ("Package origpackage not installed"). Is there some distinction between "real" and virtual packages I'm missing?

    EDIT: To be precise, what I want to replace is xen-utils-common for Squeeze. My tao-xen-utils-common has the following control file:

        Source: tao-xen-utils-common
        Section: kernel
        Priority: optional
        Maintainer: Creshal <[email protected]>
        Build-Depends: debhelper
        Standards-Version: 3.8.0
        Homepage: http://tao.at

        Package: tao-xen-utils-common
        Architecture: all
        Depends: gawk, lsb-base, udev, xenstore-utils, tao-firewall
        Provides: xen-utils-common
        Conflicts: xen-utils-common
        Replaces: xen-utils-common
        Description: Xen administrative tools - common files (modified)
         The userspace tools to manage a system virtualized through the Xen virtual machine monitor. Modified for use with TAO Firewall.

    Installing xen-utils-4.0 fails, however:

        foo@bar# apt-cache showpkg tao-xen-utils-common
        Package: tao-xen-utils-common
        Versions:
        4.0.0-1tao1 (/var/lib/apt/lists/repo.tao.at_dists_stable_main_binary-amd64_Packages) (/var/lib/dpkg/status)
        Description Language:
        File: /var/lib/apt/lists/repo.tao.at_dists_stable_main_binary-amd64_Packages
        MD5: 7c2503f563fca13b33b4eb3cbcb3c129
        Reverse Depends:
          tao-firewall,tao-xen-utils-common
          tao-firewall,tao-xen-utils-common
        Dependencies:
        4.0.0-1tao1 - gawk (0 (null)) lsb-base (0 (null)) udev (0 (null)) xenstore-utils (0 (null)) tao-firewall (0 (null)) xen-utils-common (0 (null)) xen-utils-common (0 (null))
        Provides:
        4.0.0-1tao1 - xen-utils-common
        Reverse Provides:

        foo@bar# apt-get install xen-utils-4.0
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        The following extra packages will be installed:
          xen-utils-common
        Suggested packages:
          xen-docs-4.0
        The following packages will be REMOVED:
          tao-xen-utils-common
        The following NEW packages will be installed:
          xen-utils-4.0 xen-utils-common

    EDIT:

        foo@bar# apt-cache policy xen-utils-4.0
        xen-utils-4.0:
          Installed: (none)
          Candidate: 4.0.1-4
          Version table:
             4.0.1-4 0
                500 http://ftp.at.debian.org/debian/ stable/main amd64 Packages
             4.0.1-4 0
                500 http://security.debian.org/ stable/updates/main amd64 Packages

    Read the article

  • Managing Internal Yum Repository Groups

    - by elmt
    What is the best method for handling yum groups dependencies? For example, take this comps.xml file

        <comps>
          <group>
            <id>production</id>
            <name>Production</name>
            <default>true</default>
            <description>Packages required to run</description>
            <uservisible>true</uservisible>
            <packagelist>
              <packagereq type="default">ssh</packagereq>
            </packagelist>
          </group>
          <group>
            <id>development</id>
            <name>Development</name>
            <default>false</default>
            <description>Packages required to develop</description>
            <uservisible>true</uservisible>
            <packagelist>
              <packagereq type="default">gcc</packagereq>
            </packagelist>
          </group>
        </comps>

    which is packaged with createrepo -g comps.xml x86_64. The ssh and gcc rpms are not installed in the x86_64 directory. If I run yum groupinstall development, yum is smart enough to pull the gcc package from the RHEL repo even though the groups are defined in my internal repository. However, is this the proper way of doing this, or should I copy the rpms to my local repository and recreate the repo?

    Read the article

  • Build System with Recursive Dependency Aggregation

    - by radman
    Hi, I recently began setting up my own library and projects using a cross-platform build system (generates make files, Visual Studio solutions/projects etc. on demand) and I have run into a problem that has likely been solved already. The issue is this: when an application has a dependency that also has dependencies, then the application being linked must link the dependency and also all of its sub-dependencies. This proceeds in a recursive fashion, e.g. (for argument's sake let's assume that we are dealing exclusively with static libraries):

        TopLevelApp.exe
            dependency_A
                dependency_A-1
                dependency_A-2
            dependency_B
                dependency_B-1
                dependency_B-2

    So in this example TopLevelApp will need to link dependency_A, dependency_A-1, dependency_A-2 etc., and the same for B. I think the responsibility of remembering all of these manually in the target application is pretty sub-optimal. There is also the issue of ensuring the same version of the dependency is used across all targets (assuming that some targets depend on the same things, e.g. boost). Now linking all of the libraries is required and there is no way of getting around it. What I am looking for is a build system that manages this for you, so all you have to do is specify that you depend on something and the appropriate dependencies of that library will be pulled in automatically. The build system I have been looking at is premake (premake4), which doesn't handle this (as far as I can determine). Does anyone know of a build system that does handle this? And if there isn't one, then why not?

    Read the article

  • Ruby Gems Either/Or Dependency?

    - by Myron
    My VCR gem currently depends on FakeWeb. I have it declared as a dependency in my gemspec. I'm working with the author of WebMock (a library that provides similar functionality to FakeWeb) to get VCR to work with WebMock as well, so that users of VCR could use either FakeWeb or WebMock as the HTTP stubbing library. When it comes time to release the next version of VCR, I'm not sure of the best way to handle these dependencies. VCR will depend on either WebMock or FakeWeb (but doesn't need both), and will have certain version requirements for both. I could add both as dependencies to my gemspec, but when you use bundler, it bundles all gem dependencies--so both FakeWeb and WebMock would get bundled with the application. I've been thinking that maybe I won't declare either gem as a dependency, and will instead check for the presence of either library at run time (along with checking the version), and give the user a helpful error message if neither is present at a supported version. But I'm not really sure I like this approach either. Does anyone have a suggestion for the best way to handle an either/or gem dependency? Thanks!

    Read the article

  • ASP.NET CacheDependency out of ThreadPool

    - by Stephen
    In an async HTTP handler, we add items to the ASP.NET cache with dependencies on some files. If the async method executes on a thread from the ThreadPool, all is fine:

        AsyncResult result = new AsyncResult(context, cb, extraData);
        ThreadPool.QueueUserWorkItem(new WaitCallback(DoProcessRequest), result);

    But as soon as we try to execute on a thread outside the ThreadPool:

        AsyncResult result = new AsyncResult(context, cb, extraData);
        Runner runner = new Runner(result);
        Thread thread = new Thread(new ThreadStart(runner.Run));

    ... where Runner.Run just invokes DoProcessRequest, the dependencies trigger right after the thread exits. I.e. the items are immediately removed from the cache, the reason being the dependencies. We want to use an out-of-pool thread because the processing might take a long time. So obviously something's missing when we create the thread. We might need to propagate the call context, the HTTP context... Has anybody already encountered this issue? Note: off-the-shelf custom thread pools probably solve this. Writing our own thread pool is probably a bad idea (think NIH syndrome). Yet I'd like to understand this in detail, though.
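    One thing worth ruling out is when the CacheDependency itself is constructed. The sketch below (hypothetical: Worker, ComputeValue, the cache key and the file path are all made up, and this is an experiment rather than a confirmed fix) creates the file dependency while still on the ASP.NET request thread, where the HTTP context and impersonation are in place, and only does the long-running work and the Cache.Insert on the dedicated thread:

        using System;
        using System.Threading;
        using System.Web;
        using System.Web.Caching;

        public class Worker
        {
            private readonly string cacheKey;
            private readonly string watchedFile;
            private readonly CacheDependency dependency;

            public Worker(string cacheKey, string watchedFile)
            {
                this.cacheKey = cacheKey;
                this.watchedFile = watchedFile;
                // Created here, on the request thread, before the hand-off to the custom thread.
                this.dependency = new CacheDependency(watchedFile);
            }

            public void Run()
            {
                // Long-running work happens on the dedicated thread...
                string value = ComputeValue();

                // ...and the item is inserted with the dependency created earlier.
                HttpRuntime.Cache.Insert(cacheKey, value, dependency);
            }

            private string ComputeValue()
            {
                return "expensive result for " + watchedFile;
            }
        }

        // Usage from the handler's Begin method (sketch):
        // var worker = new Worker("some-key", context.Server.MapPath("~/App_Data/data.xml"));
        // new Thread(worker.Run) { IsBackground = true }.Start();

    Whether this prevents the premature removal depends on what state CacheDependency captures at construction time, so treat it as a diagnostic step; if the items survive with this arrangement, the problem is the context available when the dependency is created, not the out-of-pool thread itself.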

    Read the article

  • Doubts regarding the behaviour of 'autoremove' command and '--auto-remove' flag

    - by Jasper Loy
    After reading several man pages and forums, I thought that running 'apt-get autoremove' without any following argument removes all unused dependencies left on the system, while running 'apt-get autoremove xxx' removes xxx together with its unused dependencies. However, I found this not to be true. Running 'apt-get autoremove xxx' not only removes xxx together with its unused dependencies, it also removes all other unused dependencies. So I tried running 'apt-get remove --auto-remove xxx', thinking that this would remove only xxx and its unused dependencies. To my surprise, this also removed xxx, its unused dependencies and all other unused dependencies. Is this the intended behaviour of the commands or a bug? Is there any quick way to remove xxx and its unused dependencies without removing other unused dependencies?

    Read the article

  • Can plugins loaded with MEF resolve their own internal dependencies with the same MEF container for

    - by Dave
    From my experimentation, I think the answer is "kind of", but I could have made a mistake. I have an application that loads appliance plugins with MEF. That part is working fine. Now let's say that my BlenderAppliance wants to resolve several of its dependencies with MEF, each of which implements IApplianceFeature. I've just applied the ImportMany attribute in my plugin, and I made sure to create the plugin using MEF so that the imports work properly. I said "kind of" because some of the plugin's internals (i.e. the model) are loading with MEF just fine, but the IApplianceFeatures aren't. The difference here is that the IApplianceFeatures are themselves assemblies. And at the moment, they are in one folder above that of the plugin itself, i.e.

        + application folder
        |   IApplianceFeature1.dll
        |   IApplianceFeature2.dll
        +---+ plugin folder
            |   BlenderAppliance.dll

    Now if my application uses an AggregateCatalog to load the "." and ".\plugins" folders, why doesn't it ever load the IApplianceFeature assemblies for me? Is it possible / advisable to have the plugin create its own MEF container to resolve its dependencies, or does really nasty stuff happen? If you have any stories about this scenario, please share. :)
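    A minimal composition sketch for this layout follows (the folder names follow the question, while PluginHost, IAppliance and the discovery details are assumptions): the application folder and every plugin subfolder each get a DirectoryCatalog inside one AggregateCatalog, so the feature assemblies one level up and the plugin assemblies resolve from the same container:

        using System;
        using System.ComponentModel.Composition;
        using System.ComponentModel.Composition.Hosting;
        using System.IO;

        public interface IApplianceFeature { void Initialize(); }
        public interface IAppliance { string Name { get; } }

        public class PluginHost
        {
            [ImportMany]
            public IAppliance[] Appliances { get; set; }

            public void Compose(string applicationFolder)
            {
                var catalog = new AggregateCatalog();

                // Application folder: picks up IApplianceFeature1.dll, IApplianceFeature2.dll, ...
                catalog.Catalogs.Add(new DirectoryCatalog(applicationFolder));

                // Each plugin folder: picks up BlenderAppliance.dll and friends.
                string pluginsRoot = Path.Combine(applicationFolder, "plugins");
                if (Directory.Exists(pluginsRoot))
                {
                    foreach (string pluginDir in Directory.GetDirectories(pluginsRoot))
                        catalog.Catalogs.Add(new DirectoryCatalog(pluginDir));
                }

                var container = new CompositionContainer(catalog);
                container.ComposeParts(this); // satisfies [ImportMany] here and the plugins' own imports
            }
        }

    Note that a DirectoryCatalog only scans the directory it is given, not its subdirectories, so a catalog for ".\plugins" alone never sees the per-plugin folders; adding each folder to one shared container is usually preferable to a second, plugin-owned container, which would not see the host's exports.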

    Read the article

  • MacPorts - Installing Port, Dependencies Failed

    - by Louis
    I am attempting to install xulrunner on OSX 10.6.3 using the following: sudo port install xulrunner However, I am receiving the following error: nat-10-200-136-126:phoneyc-new $ sudo port install xulrunner ---> Computing dependencies for xulrunner ---> Activating zlib @1.2.5_0 Error: The following dependencies failed to build: gconf dbus-glib glib2 zlib gtk-doc docbook-xml docbook-xml-4.1.2 xmlcatmgr docbook-xml-4.2 docbook-xml-4.3 docbook-xml-4.4 docbook-xml-4.5 docbook-xml-5.0 docbook-xsl gnome-doc-utils iso-codes libxslt libxml2 p5-xml-parser py26-libxml2 python26 bzip2 db46 gdbm openssl readline sqlite3 tk Xft2 fontconfig freetype xrender xorg-libX11 xorg-bigreqsproto xorg-inputproto xorg-kbproto xorg-libXau xorg-xproto xorg-libXdmcp xorg-util-macros xorg-xcmiscproto xorg-xextproto xorg-xf86bigfontproto xorg-xtrans xorg-renderproto tcl xorg-libXScrnSaver xorg-libXext xorg-scrnsaverproto rarian getopt intltool gnome-common p5-getopt-long p5-pathtools p5-scalar-list-utils gtk2 atk cairo libpixman libpng jasper jpeg pango shared-mime-info tiff xorg-libXcomposite xorg-compositeproto xorg-libXfixes xorg-fixesproto xorg-libXcursor xorg-libXdamage xorg-damageproto xorg-libXi xorg-libXinerama xorg-xineramaproto xorg-libXrandr xorg-randrproto orbit2 libidl policykit heimdal lcms libcanberra gstreamer bison flex gzip texinfo lzmautils libvorbis libogg libnotify nss xorg-libXt xorg-libsm xorg-libice Error: Status 1 encountered during processing. Before reporting a bug, first run the command again with the -d flag to get complete output. nat-10-200-136-126:phoneyc-new$ I am unsure how to correct this issue, so any help would be much appreciated!

    Read the article

  • Structuring projects & dependencies of large winforms applications in C#

    - by Benjol
    UPDATE: This is one of my most-visited questions, and yet I still haven't really found a satisfactory solution for my project. One idea I read in an answer to another question is to create a tool which can build solutions 'on the fly' for projects that you pick from a list. I have yet to try that though. How do you structure a very large application? Multiple smallish projects/assemblies in one big solution? A few big projects? One solution per project? And how do you manage dependencies in the case where you don't have one solution? Note: I'm looking for advice based on experience, not answers you found on Google (I can do that myself). I'm currently working on an application which has upward of 80 DLLs, each in its own solution. Managing the dependencies is almost a full-time job. There is a custom in-house 'source control' with added functionality for copying dependency DLLs all over the place. Seems like a sub-optimal solution to me, but is there a better way? Working on a solution with 80 projects would be pretty rough in practice, I fear. (Context: WinForms, not web) EDIT: (If you think this is a different question, leave me a comment.) It seems to me that there are interdependencies between:

    - project/solution structure for an application
    - folder/file structure
    - branch structure for source control (if you use branching)

    But I have great difficulty separating these out to consider them individually, if that is even possible. I have asked another related question here.

    Read the article
