Search Results

Search found 3698 results on 148 pages for 'dependency injection'.


  • Parent Objects

    - by Ali Bahrami
    Support for Parent Objects was added in Solaris 11 Update 1. The following material is adapted from the PSARC case and the Solaris Linker and Libraries Manual.

    A "plugin" is a shared object, usually loaded via dlopen(), that is used by a program in order to allow the end user to add functionality to the program. Examples of plugins include those used by web browsers (flash, acrobat, etc.), as well as mdb and elfedit modules. The object that loads the plugin at runtime is called the "parent object". Unlike most object dependencies, the parent is not identified by name, but by its status as the object doing the load.

    Historically, building a good plugin has been more complicated than it should be:

    - A parent and its plugin usually share a 2-way dependency: the plugin provides one or more routines for the parent to call, and the parent supplies support routines for use by the plugin for things like memory allocation and error reporting.
    - It is a best practice to build all objects, including plugins, with the -z defs option, in order to ensure that the object specifies all of its dependencies and is self contained. However:
      - The parent is usually an executable, which cannot be linked to via the usual library mechanisms provided by the link-editor.
      - Even if the parent is a shared object, which could be a normal library dependency of the plugin, it may be desirable to build plugins that can be used by more than one parent, in which case embedding a NEEDED entry for one of the parents is undesirable.

    The usual way to build a high quality plugin with -z defs uses a special mapfile provided by the parent. This mapfile defines the parent routines, specifying the PARENT attribute (see example below). This works, but is inconvenient and error prone. The symbol table in the parent already describes what it makes available to plugins — ideally the plugin would obtain that information directly rather than from a separate mapfile.

    The new -z parent option to ld allows a plugin to link to the parent and access the parent's symbol table. This differs from a typical dependency:

    - No NEEDED record is created. The relationship is recorded as a logical connection to the parent, rather than as an explicit object name.
    - However, it operates in the same manner as any other dependency in terms of making symbols available to the plugin.

    When the -z parent option is used, the link-editor records the basename of the parent object in the dynamic section, using the new tag DT_SUNW_PARENT. This is an informational tag: it is not used by the runtime linker to locate the parent, but it is available for diagnostic purposes.

    The ld(1) manpage documentation for the -z parent option is:

        -z parent=object
            Specifies a "parent object", which can be an executable or
            shared object, against which to link the output object. This
            option is typically used when creating "plugin" shared objects
            intended to be loaded by an executable at runtime via the
            dlopen() function. The symbol table from the parent object is
            used to satisfy references from the plugin object. The use of
            the -z parent option makes symbols from the object calling
            dlopen() available to the plugin.

    Example

    For this example, we use a main program and a plugin. The parent provides a function named parent_callback() for the plugin to call.
    The plugin provides a function named plugin_func() to the parent:

        % cat main.c
        #include <stdio.h>
        #include <dlfcn.h>
        #include <link.h>

        void
        parent_callback(void)
        {
                printf("plugin_func() has called parent_callback()\n");
        }

        int
        main(int argc, char **argv)
        {
                typedef void plugin_func_t(void);
                void *hdl;
                plugin_func_t *plugin_func;

                if (argc != 2) {
                        fprintf(stderr, "usage: main plugin\n");
                        return (1);
                }

                if ((hdl = dlopen(argv[1], RTLD_LAZY)) == NULL) {
                        fprintf(stderr, "unable to load plugin: %s\n", dlerror());
                        return (1);
                }

                plugin_func = (plugin_func_t *) dlsym(hdl, "plugin_func");
                if (plugin_func == NULL) {
                        fprintf(stderr, "unable to find plugin_func: %s\n", dlerror());
                        return (1);
                }

                (*plugin_func)();
                return (0);
        }

        % cat plugin.c
        #include <stdio.h>

        extern void parent_callback(void);

        void
        plugin_func(void)
        {
                printf("parent has called plugin_func() from plugin.so\n");
                parent_callback();
        }

    Building this in the traditional manner, without -z defs:

        % cc -o main main.c
        % cc -G -o plugin.so plugin.c
        % ./main ./plugin.so
        parent has called plugin_func() from plugin.so
        plugin_func() has called parent_callback()

    As noted above, when building any shared object, the -z defs option is recommended, in order to ensure that the object is self contained and specifies all of its dependencies. However, the use of -z defs prevents the plugin object from linking, due to the unsatisfied symbol from the parent object:

        % cc -zdefs -G -o plugin.so plugin.c
        Undefined           first referenced
         symbol                 in file
        parent_callback         plugin.o
        ld: fatal: symbol referencing errors. No output written to plugin.so

    A mapfile can be used to specify to ld that the parent_callback symbol is supplied by the parent object:

        % cat plugin.mapfile
        $mapfile_version 2
        SYMBOL_SCOPE {
            global:
                parent_callback { FLAGS = PARENT };
        };

        % cc -zdefs -Mplugin.mapfile -G -o plugin.so plugin.c

    However, the -z parent option to ld is the most direct solution to this problem, allowing the plugin to actually link against the parent object and obtain the available symbols from it. An added benefit of using -z parent instead of a mapfile is that the name of the parent object is recorded in the dynamic section of the plugin, and can be displayed by the file utility:

        % cc -zdefs -zparent=main -G -o plugin.so plugin.c
        % elfdump -d plugin.so | grep PARENT
            [0]  SUNW_PARENT  0xcc  main
        % file plugin.so
        plugin.so: ELF 32-bit LSB dynamic lib 80386 Version 1, parent main, dynamically linked, not stripped
        % ./main ./plugin.so
        parent has called plugin_func() from plugin.so
        plugin_func() has called parent_callback()

    We can also observe this in elfedit plugins on systems running Solaris 11 Update 1 or newer:

        % file /usr/lib/elfedit/dyn.so
        /usr/lib/elfedit/dyn.so: ELF 32-bit LSB dynamic lib 80386 Version 1, parent elfedit, dynamically linked, not stripped, no debugging information available

    Related Other Work

    The GNU ld has an option named --just-symbols that can be used in a similar manner:

        --just-symbols=filename
            Read symbol names and their addresses from filename, but do
            not relocate it or include it in the output. This allows your
            output file to refer symbolically to absolute locations of
            memory defined in other programs. You may use this option
            more than once.

    -z parent is a higher level operation aimed specifically at simplifying the construction of high quality plugins. Although it employs the same underlying operation, it differs from --just-symbols in 2 significant ways:

    - There can only be one parent.
    - The parent is recorded in the created object, and can be displayed by 'file' or other similar tools.

    Read the article

  • Flow-Design Cheat Sheet – Part I, Notation

    - by Ralf Westphal
    You want to avoid the pitfalls of object oriented design? Then this is the right place to start. Use Flow-Oriented Analysis (FOA) and -Design (FOD, or just FD for Flow-Design) to understand a problem domain and design a software solution.

    Flow-Orientation as described here is related to Flow-Based Programming, Event-Based Programming, Business Process Modelling, and even Event-Driven Architectures. But even though "thinking in flows" is not new, I found it helpful to deviate from those precursors for several reasons. Some aim at systems too big for the average programmer, some are concerned only with asynchronous processing, some are not very much concerned with programming at all. What I was looking for was a design method to help in software projects of any size, be they large or tiny, involving synchronous or asynchronous processing, local or distributed, running on the web, on the desktop, or on a smartphone. That's why I took ideas from all of the above sources, and some additional ones, and came up with Event-Based Components, which later got repositioned and renamed to Flow-Design. In the meantime this has generated some discussion (in the German developer community) and several teams have started to work with Flow-Design. Also, I've conducted quite a few trainings using Flow-Orientation for design. The results are very promising: developers find it much easier to design software using Flow-Orientation than OOAD-based object orientation.

    Since Flow-Orientation is moving fast and is not covered completely by a single source like a book, demand has increased for at least an overview of the current state of its notation. This page tries to answer this demand by briefly introducing/describing every notational element as well as its translation into C# source code. Take this as a cheat sheet to put next to your whiteboard when designing software. However, please do not expect any explanation as to the reasons behind Flow-Design elements. Details on why Flow-Design at all, and why in this specific way, you'll find in the literature covering the topic. Here's a resource page on Flow-Design/Event-Based Components, if you're able to read German.

    Notation

    Connected Functional Units

    The basic element of any FOD is the functional unit (FU). Think of FUs as some kind of software code block processing data. For the moment, forget about classes, methods, "components", assemblies or whatever. See a FU as an abstract piece of code. Software then consists of just collaborating FUs. I'm using circles/ellipses to draw FUs. But if you like, use rectangles; whatever suits your whiteboard needs best.

    The purpose of FUs is to process input and produce output. FUs are transformational. However, FUs are not called and do not call other FUs. There is no dependency between FUs. Data just flows into a FU (input) and out of it (output). From where, and where to, is of no concern to a FU. This way FUs can be concatenated in arbitrary ways. Each FU can accept input from many sources and produce output for many sinks.

    Flows

    Connected FUs form a flow with a start and an end. Data enters a flow at a source, and it leaves it through a sink. Think of sources and sinks as special FUs which connect wires to the environment of a network of FUs.

    Wiring Details

    Data flows into/out of FUs through wires. This is to allude to electrical engineering, which has long been working with composable parts. Wires are attached to FUs using pins. They are the entry/exit points for the data flowing along the wires. Input-/output pins currently need not be drawn explicitly. This is to keep designing on a whiteboard simple and quick.

    Data flowing is of some type, so wires have a type attached to them. And pins have names. If there is only one input pin and one output pin on a FU, though, you don't need to mention them. The default is Process for a single input pin, and Result for a single output pin. But you're free to give even single pins different names. There is a shortcut in use to address a certain pin on a destination FU. The type of the wire is put in parentheses for two reasons: 1. this way a "no-type" wire can be easily denoted; 2. this is a natural way to describe tuples of data. To describe how much data is flowing, a star can be put next to the wire type.

    Nesting – Boards and Parts

    If more than 5 to 10 FUs need to be put in a flow, an FD starts to become hard to understand. To keep diagrams clutter free, they can be nested. You can turn any FU into a flow. This leads to Flow-Designs with different levels of abstraction. A in the above illustration is a high level functional unit; A.1 and A.2 are lower level functional units. One of the purposes of Flow-Design is to be able to describe systems on different levels of abstraction and thus make it easier to understand them. Humans use abstraction/decomposition to get a grip on complexity. Flow-Design strives to support this and make levels of abstraction first class citizens for programming.

    You can read the above illustration like this: functional units A.1 and A.2 detail what A is supposed to do. The whole of A's responsibility is decomposed into the smaller responsibilities A.1 and A.2. FU A thus does not do anything itself anymore! All A is responsible for is actually accomplished by the collaboration between A.1 and A.2.

    Since A now does not do anything anymore except contain A.1 and A.2, functional units are divided into two categories (a code sketch at the end of this overview makes the distinction concrete):

    - Boards just contain other functional units; their sole responsibility is to wire them up. A is a board. Boards thus depend on the functional units nested within them. This dependency is not of a functional nature, though. Boards are not dependent on services provided by nested functional units. They are just concerned with their interfaces, to be able to plug them together.
    - Parts are the workhorses of flows. They contain the real domain logic. They actually transform input into output. However, they do not depend on other functional units.

    Please note the usage of source and sink in boards. They correspond to the input-pins and output-pins of the board.

    Implicit Dependencies

    Nesting functional units leads to a dependency tree. Boards depend on nested functional units; they are the inner nodes of the tree. Parts are independent; they are the leaves. Even though dependencies are the bane of software development, Flow-Design does not usually draw these dependencies. They are implicitly created by visually nesting functional units. And they are harmless. Boards are so simple in their functionality, they are little affected by changes in the functional units they depend on.

    But functional units are implicitly dependent on more than nested functional units. They are also dependent on the data types of the wires attached to them. This too is natural and thus does not need to be made explicit. And it pertains mainly to parts. Since boards don't do anything with regard to a problem domain, they don't care much about data types. Their infrastructural purpose just needs the types of input/output-pins to match.

    Explicit Dependencies

    You could say Flow-Orientation is about tackling complexity at its root cause: dependencies. "Natural" dependencies are depicted naturally, i.e. implicitly. And wherever possible, dependencies are not even created. Functional units don't know their collaborators within a flow. This is core to Flow-Orientation. That makes for high composability of functional units. A part is as independent of other functional units as a motor is from the rest of the car. And a board is as dependent on nested functional units as a motor is on a spark plug or a crankshaft. With Flow-Design, software development moves closer to how hardware is constructed.

    Implicit dependencies are not enough, though. Sometimes explicit dependencies make designs easier – as counterintuitive as this might sound. So FD notation needs a way to denote explicit dependencies. Data flows along wires. But data does not flow along dependency relations. Instead, dependency relations represent service calls. Functional unit C is depending on/calling services on functional unit S. If you want to be more specific, name the services next to the dependency relation. Although you should try to steer clear of explicit dependencies, they are fundamentally ok. See them as a way to add another dimension to a flow. Usually the functionality of the independent FU ("Customer repository" above) is orthogonal to the domain of the flow it is referenced by. If you like, emphasize this by using different shapes for dependent and independent FUs, like above. Such dependencies can be used to link in resources like databases or shared in-memory state. FUs can not only produce output but can also have side effects.

    A common pattern for using such explicit dependencies is to hook a GUI into a flow as the source and/or the sink of data, which can be shortened as the illustrations show. Treat FUs that others depend on as boards (with a special non-FD API the dependent part is connected to), but do not embed them in the flow diagram in which they are depended upon.

    Attributes of Functional Units

    Creation and usage of functional units can be modified with attributes. So far the following have shown to be helpful:

    - Singleton: FUs are multitons by default. FUs in the same or different flows with the same name refer to the same functionality, but to different instances. Think of functional units as objects that get instantiated anew wherever they appear in a design. Sometimes, though, it's helpful to reuse the same instance of a functional unit; this is always due to valuable state it holds. Signify this by annotating the FU with an "(S)".
    - Multiton: FUs on which others depend are singletons by default. This is because they usually are introduced where shared state comes into play. If you want to change them to be multitons, mark them with an "(M)".
    - Configurable: Some parts need to be configured before they can do their work in a flow. Annotate them with a "(C)" to have them initialized before any data items to be processed by them arrive. Do not assume any order in which FUs are configured. How such configuration happens is an implementation detail.
    - Entry point: In each design there needs to be a single part where "it all starts". That's the entry point for all processing. It's like Program.Main() in C# programs. Mark the entry point part with an "(E)". Quite often this will be the GUI part. How the entry point is started is an implementation detail. Just consider it the first FU to start doing its job.

    Patterns / Standard Parts

    If more than a single wire is attached to an output-pin, that's called a split (or fork). The same data flows on all of the wires. Remember: Flow-Designs are synchronous by default. So a split does not mean data is processed in parallel afterwards. Processing still happens synchronously, one branch after another. Do not assume any specific order of the processing on the different branches after the split. It is common to do a split and let only parts of the original data flow on through the branches. This effectively means a map is needed after a split. This map can be implicit or explicit.

    Although FUs can have multiple input-pins, it is preferable in most cases to combine input data from different branches using an explicit join. The default output of a join is a tuple of its input values. The default behavior of a join is to output a value whenever a new input is received. However, to produce its first output, a join needs an input on all of its input-pins. Other join behaviors can be:

    - reset all inputs after an output
    - only produce output if data arrives on certain input-pins
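    To make the board/part split concrete in code: the cheat sheet itself maps the notation to C# events, so the following is only my own minimal Java sketch of the same idea, with every name (ToUpperCase, Printer, EchoBoard, the pin methods) invented for illustration. A part exposes an input method (its Process pin) and an output callback (its Result pin); a board contains parts and does nothing but wire pins together.

        // A minimal, hypothetical sketch of Flow-Design's part/board split.
        // All type and pin names are invented; the original cheat sheet
        // maps the notation to C# events instead of Java callbacks.
        import java.util.function.Consumer;

        // A part: transforms input into output, knows nothing about its peers.
        class ToUpperCase {
            private Consumer<String> result = s -> {};   // output pin, default name "Result"

            void process(String input) {                 // input pin, default name "Process"
                result.accept(input.toUpperCase());
            }

            void onResult(Consumer<String> wire) { result = wire; }
        }

        class Printer {
            void process(String input) { System.out.println(input); }
        }

        // A board: contains parts and wires their pins together; no domain logic.
        class EchoBoard {
            private final ToUpperCase upper = new ToUpperCase();
            private final Printer printer = new Printer();

            EchoBoard() { upper.onResult(printer::process); }    // wire: Result -> Process
            void process(String input) { upper.process(input); } // board input pin forwards in
        }

        public class FlowDemo {
            public static void main(String[] args) {
                new EchoBoard().process("hello, flow-design");   // prints HELLO, FLOW-DESIGN
            }
        }

    The point of the sketch is the dependency structure: the two parts compile without referencing each other, and only the board knows the wiring.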

    Read the article

  • unable to install anything on ubuntu 9.10 with aptitude

    - by Srisa
    Hello, earlier I could install software by using the 'sudo aptitude install' command. Today when I tried to install rkhunter I am getting errors. It is not just rkhunter; I am not able to install anything. Here is the text output:

        user@server:~$ sudo aptitude install rkhunter
        ................
        ................
        20% [3 rkhunter 947/271kB 0%]
        Get:4 http://archive.ubuntu.com karmic/universe unhide 20080519-4 [832kB]
        40% [4 unhide 2955/832kB 0%]
        100% [Working]
        Fetched 1394kB in 1s (825kB/s)
        Preconfiguring packages ...
        Selecting previously deselected package lsof.
        (Reading database ... 20076 files and directories currently installed.)
        Unpacking lsof (from .../lsof_4.81.dfsg.1-1_amd64.deb) ...
        dpkg: error processing /var/cache/apt/archives/lsof_4.81.dfsg.1-1_amd64.deb (--unpack):
         unable to create `/usr/bin/lsof.dpkg-new' (while processing `./usr/bin/lsof'): Permission denied
        dpkg-deb: subprocess paste killed by signal (Broken pipe)
        Selecting previously deselected package libmd5-perl.
        Unpacking libmd5-perl (from .../libmd5-perl_2.03-1_all.deb) ...
        Selecting previously deselected package rkhunter.
        Unpacking rkhunter (from .../rkhunter_1.3.4-5_all.deb) ...
        dpkg: error processing /var/cache/apt/archives/rkhunter_1.3.4-5_all.deb (--unpack):
         unable to create `/usr/bin/rkhunter.dpkg-new' (while processing `./usr/bin/rkhunter'): Permission denied
        dpkg-deb: subprocess paste killed by signal (Broken pipe)
        Selecting previously deselected package unhide.
        Unpacking unhide (from .../unhide_20080519-4_amd64.deb) ...
        dpkg: error processing /var/cache/apt/archives/unhide_20080519-4_amd64.deb (--unpack):
         unable to create `/usr/sbin/unhide-posix.dpkg-new' (while processing `./usr/sbin/unhide-posix'): Permission denied
        dpkg-deb: subprocess paste killed by signal (Broken pipe)
        Processing triggers for man-db ...
        Errors were encountered while processing:
         /var/cache/apt/archives/lsof_4.81.dfsg.1-1_amd64.deb
         /var/cache/apt/archives/rkhunter_1.3.4-5_all.deb
         /var/cache/apt/archives/unhide_20080519-4_amd64.deb
        E: Sub-process /usr/bin/dpkg returned an error code (1)
        A package failed to install.  Trying to recover:
        Setting up libmd5-perl (2.03-1) ...
        Building dependency tree
        Reading state information...
        ...........
        ....................

    I have removed some lines to reduce the text. All the error messages are in here, though. My experience with Linux is limited and I am not sure what the problem is or how it is to be resolved. Thanks.

    Read the article

  • ClassFormatError when using javaee:javaee-api

    - by Digambar Daund
    This is my pom.xml:

        <?xml version="1.0" encoding="UTF-8"?>
        <project xmlns="http://maven.apache.org/POM/4.0.0"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
          <modelVersion>4.0.0</modelVersion>
          <parent>
            <groupId>dd</groupId>
            <artifactId>jee6</artifactId>
            <version>0.0.1-SNAPSHOT</version>
          </parent>
          <groupId>dd</groupId>
          <artifactId>business-tier-impl</artifactId>
          <name>business-tier-impl</name>
          <version>0.0.1-SNAPSHOT</version>
          <packaging>ejb</packaging>
          <description>business-tier-impl</description>
          <dependencies>
            <dependency>
              <groupId>javax</groupId>
              <artifactId>javaee-api</artifactId>
              <version>6.0</version>
              <scope>provided</scope>
            </dependency>
            <dependency>
              <groupId>org.testng</groupId>
              <artifactId>testng</artifactId>
              <version>5.11</version>
              <scope>test</scope>
              <classifier>jdk15</classifier>
            </dependency>
            <dependency>
              <groupId>org.apache.openejb</groupId>
              <artifactId>openejb-core</artifactId>
              <version>3.1.2</version>
              <scope>test</scope>
            </dependency>
          </dependencies>
          <build>
            <plugins>
              <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                  <source>1.6</source>
                  <target>1.6</target>
                </configuration>
              </plugin>
              <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-ejb-plugin</artifactId>
                <configuration>
                  <ejbVersion>3.1.2</ejbVersion>
                </configuration>
              </plugin>
              <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
              </plugin>
            </plugins>
          </build>
        </project>

    Below is the test case setup method:

        @BeforeClass
        public void bootContainer() throws Exception {
            Properties props = new Properties();
            props.setProperty(Context.INITIAL_CONTEXT_FACTORY,
                    LocalInitialContextFactory.class.getName());
            Context context = new InitialContext(props);
            service = (HelloService) context.lookup("HelloServiceLocal");
        }

    I get the error at the line where the InitialContext is created:

        Apache OpenEJB 3.1 build: 20081009-03:31 http://openejb.apache.org/
        INFO - openejb.home = C:\DD\WORKSPACES\jee6\business-tier-impl
        INFO - openejb.base = C:\DD\WORKSPACES\jee6\business-tier-impl
        FATAL - OpenEJB has encountered a fatal error and cannot be started: OpenEJB encountered an unexpected error while attempting to instantiate the assembler.
        java.lang.ClassFormatError: Absent Code attribute in method that is not native or abstract in class file javax/resource/spi/ResourceAdapterInternalException
        .
        .
        .
        FAILED CONFIGURATION: @BeforeClass bootContainer
        javax.naming.NamingException: Attempted to load OpenEJB. OpenEJB has encountered a fatal error and cannot be started: OpenEJB encountered an unexpected error while attempting to instantiate the assembler.: Absent Code attribute in method that is not native or abstract in class file javax/resource/spi/ResourceAdapterInternalException [Root exception is org.apache.openejb.OpenEJBException: OpenEJB has encountered a fatal error and cannot be started: OpenEJB encountered an unexpected error while attempting to instantiate the assembler.: Absent Code attribute in method that is not native or abstract in class file javax/resource/spi/ResourceAdapterInternalException]
            at org.apache.openejb.client.LocalInitialContextFactory.init(LocalInitialContextFactory.java:54)
            at org.apache.openejb.client.LocalInitialContextFactory.getInitialContext(LocalInitialContextFactory.java:41)
            at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:667)
            at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:288)
            at javax.naming.InitialContext.init(InitialContext.java:223)
            at javax.naming.InitialContext.<init>(InitialContext.java:197)
            at dd.jee6.app.HelloServiceTest.bootContainer(HelloServiceTest.java:26)
        Caused by: org.apache.openejb.OpenEJBException: OpenEJB has encountered a fatal error and cannot be started: OpenEJB encountered an unexpected error while attempting to instantiate the assembler.: Absent Code attribute in method that is not native or abstract in class file javax/resource/spi/ResourceAdapterInternalException
            at org.apache.openejb.OpenEJB$Instance.<init>(OpenEJB.java:133)
            at org.apache.openejb.OpenEJB.init(OpenEJB.java:299)
            at org.apache.openejb.OpenEJB.init(OpenEJB.java:278)
            at org.apache.openejb.loader.OpenEJBInstance.init(OpenEJBInstance.java:36)
            at org.apache.openejb.client.LocalInitialContextFactory.init(LocalInitialContextFactory.java:69)
            at org.apache.openejb.client.LocalInitialContextFactory.init(LocalInitialContextFactory.java:52)
            ... 28 more
        Caused by: java.lang.ClassFormatError: Absent Code attribute in method that is not native or abstract in class file javax/resource/spi/ResourceAdapterInternalException
            at java.lang.ClassLoader.defineClass1(Native Method)
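    A note on what commonly causes this (based on how the javax:javaee-api artifact is packaged, not on anything stated in the question): the Java EE 6 API jars published under the javax groupId are compile-time stubs whose method bodies have been stripped. Any container that actually loads those classes at runtime, as embedded OpenEJB does here, fails with the "Absent Code attribute" ClassFormatError. One frequently suggested fix is to let OpenEJB supply the API classes instead, for example by swapping in OpenEJB's unstripped javaee-api artifact; the exact version below is an assumption to check against the repository:

        <!-- Hypothetical pom.xml change: replace the stripped javax:javaee-api
             stub with OpenEJB's executable API jar (version is an assumption). -->
        <dependency>
          <groupId>org.apache.openejb</groupId>
          <artifactId>javaee-api</artifactId>
          <version>6.0-1</version>
          <scope>provided</scope>
        </dependency>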

    Read the article

  • Load application context problem in Maven managed spring-test TestNg

    - by joejax
    I am trying to set up a project with spring-test using TestNG in Maven. The code is like:

        @ContextConfiguration(locations={"test-context.xml"})
        public class AppTest extends AbstractTestNGSpringContextTests {
            @Test
            public void testApp() {
                assert true;
            }
        }

    A test-context.xml simply defines a bean:

        <bean id="app" class="org.sonatype.mavenbook.simple.App"/>

    I got a "Failed to load ApplicationContext" error when running mvn test from the command line; it seems it cannot find the test-context.xml file. However, I can get it to run correctly inside Eclipse (with the TestNG plugin). So, test-context.xml is under src/test/resources/; how do I indicate this in the pom.xml so that the 'mvn test' command will work? Thanks.

    UPDATE: Thanks for the reply. The "cannot load context file" error was caused by my moving the file around to different locations, since I thought the classpath was the problem. Now I found the context file does seem to be loaded in the Maven output, but the test fails:

        Running TestSuite
        May 25, 2010 9:55:13 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
        INFO: Loading XML bean definitions from class path resource [test-context.xml]
        May 25, 2010 9:55:13 AM org.springframework.context.support.AbstractApplicationContext prepareRefresh
        INFO: Refreshing org.springframework.context.support.GenericApplicationContext@171bbc9: display name [org.springframework.context.support.GenericApplicationContext@171bbc9]; startup date [Tue May 25 09:55:13 PDT 2010]; root of context hierarchy
        May 25, 2010 9:55:13 AM org.springframework.context.support.AbstractApplicationContext obtainFreshBeanFactory
        INFO: Bean factory for application context [org.springframework.context.support.GenericApplicationContext@171bbc9]: org.springframework.beans.factory.support.DefaultListableBeanFactory@1df8b99
        May 25, 2010 9:55:13 AM org.springframework.beans.factory.support.DefaultListableBeanFactory preInstantiateSingletons
        INFO: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@1df8b99: defining beans [app,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor]; root of factory hierarchy
        Tests run: 3, Failures: 2, Errors: 0, Skipped: 1, Time elapsed: 0.63 sec

    If I use spring-test version 3.0.2.RELEASE, the error becomes:

        org.springframework.test.context.testng.AbstractTestNGSpringContextTests.springTestContextPrepareTestInstance() is depending on nonexistent method null

    Here is the structure of the project:

        simple
        |-- pom.xml
        `-- src
            |-- main
            |   `-- java
            `-- test
                |-- java
                `-- resources
                    |-- test-context.xml
                    `-- testng.xml

    testng.xml:

        <suite name="Suite" parallel="false">
          <test name="Test">
            <classes>
              <class name="org.sonatype.mavenbook.simple.AppTest"/>
            </classes>
          </test>
        </suite>

    test-context.xml:

        <beans xmlns="http://www.springframework.org/schema/beans"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="http://www.springframework.org/schema/beans
                   http://www.springframework.org/schema/beans/spring-beans-2.0.xsd"
               default-lazy-init="true">
          <bean id="app" class="org.sonatype.mavenbook.simple.App"/>
        </beans>

    In the pom.xml, I add the testng, spring, and spring-test artifacts, and the plugins:

        <dependency>
          <groupId>org.testng</groupId>
          <artifactId>testng</artifactId>
          <version>5.1</version>
          <classifier>jdk15</classifier>
          <scope>test</scope>
        </dependency>
        <dependency>
          <groupId>org.springframework</groupId>
          <artifactId>spring</artifactId>
          <version>2.5.6</version>
        </dependency>
        <dependency>
          <groupId>org.springframework</groupId>
          <artifactId>spring-test</artifactId>
          <version>2.5.6</version>
          <scope>test</scope>
        </dependency>

        <build>
          <finalName>simple</finalName>
          <plugins>
            <plugin>
              <artifactId>maven-compiler-plugin</artifactId>
              <configuration>
                <source>1.6</source>
                <target>1.6</target>
              </configuration>
            </plugin>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-surefire-plugin</artifactId>
              <configuration>
                <suiteXmlFiles>
                  <suiteXmlFile>src/test/resources/testng.xml</suiteXmlFile>
                </suiteXmlFiles>
              </configuration>
            </plugin>
          </plugins>

    Basically, I replaced the JUnit in 'A Simple Maven Project' with TestNG; hope it works.
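    One detail worth checking, though it is an inference from the pom rather than something the log proves: TestNG 5.1 is far older than what the TestNG support in spring-test 2.5.6 was built against, and such a mismatch is a known cause of the springTestContext* configuration methods failing (and of the "depending on nonexistent method null" error with spring-test 3.x). A sketch of the commonly suggested change:

        <!-- Hypothetical fix: upgrade TestNG to a version contemporary with
             the spring-test release in use (5.11 is an assumption to verify). -->
        <dependency>
          <groupId>org.testng</groupId>
          <artifactId>testng</artifactId>
          <version>5.11</version>
          <classifier>jdk15</classifier>
          <scope>test</scope>
        </dependency>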

    Read the article

  • How can I resolve gstreamer dependencies in Ubuntu

    - by michael
    Hi, can you please tell me how I can resolve these dependencies on Ubuntu:

        checking for GSTREAMER... configure: error: Package requirements
        (gstreamer-0.10 >= 0.10 gstreamer-app-0.10 gstreamer-base-0.10
        gstreamer-pbutils-0.10 gstreamer-plugins-base-0.10 >= 0.10.25
        gstreamer-video-0.10) were not met:

        No package 'gstreamer-app-0.10' found
        No package 'gstreamer-pbutils-0.10' found
        No package 'gstreamer-plugins-base-0.10' found
        No package 'gstreamer-video-0.10' found

    I have tried:

        $ sudo apt-get install *gstreamer-video*
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Regex compilation error - Invalid preceding regular expression

        $ sudo apt-get install *gstreamer-app*
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Regex compilation error - Invalid preceding regular expression

        $ sudo apt-get install *gstreamer-base*
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        E: Regex compilation error - Invalid preceding regular expression

    Read the article

  • Importing .dll into Qt

    - by Bad Man
    I want to bring a .dll dependency into my Qt project. So I added this to my .pro file:

        win32 {
            LIBS += C:\lib\dependency.lib
            LIBS += C:\lib\dependency.dll
        }

    And then (I don't know if this is the right syntax or not):

        #include <windows.h>
        Q_DECL_IMPORT int WINAPI DoSomething();

    btw the .dll looks something like this:

        #include <windows.h>

        BOOL APIENTRY DllMain(HANDLE hModule, DWORD ul_reason_for_call, LPVOID lpReserved)
        {
            return TRUE;
        }

        extern "C" {
            int WINAPI DoSomething()
            {
                return -1;
            }
        };

    Getting error: unresolved symbol? Note: I'm not experienced with .dlls outside of .NET's ez pz assembly architecture, definitely a n00b.

    Read the article

  • Having an issue with Onejar-Maven-Plugin

    - by reverendgreen
    I am trying to package a simple Maven Java project (uses the javax.persistence API) into a single jar using the onejar-maven-plugin. I can run the program fine in Eclipse; however, when I execute the onejar I get the exception below. If someone could provide some insight, that would be appreciated. Thanks, RG

        Exception in thread "main" java.lang.reflect.InvocationTargetException
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
            at java.lang.reflect.Method.invoke(Unknown Source)
            at com.simontuffs.onejar.Boot.run(Boot.java:306)
            at com.simontuffs.onejar.Boot.main(Boot.java:159)
        Caused by: Exception [EclipseLink-30005] (Eclipse Persistence Services - 2.0.0.v20091127-r5931): org.eclipse.persistence.exceptions.PersistenceUnitLoadingException
        Exception Description: An exception was thrown while searching for persistence archives with ClassLoader: com.simontuffs.onejar.JarClassLoader@190d11
        Internal Exception: java.lang.ClassCastException: sun.misc.Launcher$AppClassLoader cannot be cast to com.simontuffs.onejar.JarClassLoader
            at org.eclipse.persistence.exceptions.PersistenceUnitLoadingException.exceptionSearchingForPersistenceResources(PersistenceUnitLoadingException.java:126)
            at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:133)
            at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:65)
            at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:78)
            at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:54)
            at com.test.onejartest.App.main(App.java:19)
            ... 6 more
        Caused by: java.lang.ClassCastException: sun.misc.Launcher$AppClassLoader cannot be cast to com.simontuffs.onejar.JarClassLoader
            at com.simontuffs.onejar.JarClassLoader.getByteStream(JarClassLoader.java:753)
            at com.simontuffs.onejar.Handler$1.getInputStream(Handler.java:50)
            at java.net.URL.openStream(Unknown Source)
            at org.eclipse.persistence.internal.jpa.deployment.ArchiveFactoryImpl.isJarInputStream(ArchiveFactoryImpl.java:124)
            at org.eclipse.persistence.internal.jpa.deployment.ArchiveFactoryImpl.createArchive(ArchiveFactoryImpl.java:106)
            at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.findPersistenceArchives(PersistenceUnitProcessor.java:213)
            at org.eclipse.persistence.internal.jpa.deployment.JPAInitializer.findPersistenceUnitInfoInArchives(JPAInitializer.java:134)
            at org.eclipse.persistence.internal.jpa.deployment.JPAInitializer.findPersistenceUnitInfo(JPAInitializer.java:125)
            at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:98)
            ... 10 more

    pom.xml:

        ...
        <pluginRepositories>
          <pluginRepository>
            <id>onejar-maven-plugin.googlecode.com</id>
            <url>http://onejar-maven-plugin.googlecode.com/svn/mavenrepo</url>
          </pluginRepository>
        </pluginRepositories>
        ...
        <build>
          <plugins>
            <plugin>
              <groupId>org.dstovall</groupId>
              <artifactId>onejar-maven-plugin</artifactId>
              <version>1.3.0</version>
              <executions>
                <execution>
                  <configuration>
                    <mainClass>com.test.onejartest.App</mainClass>
                  </configuration>
                  <goals>
                    <goal>one-jar</goal>
                  </goals>
                </execution>
              </executions>
            </plugin>
            <plugin>
              <artifactId>maven-compiler-plugin</artifactId>
              <version>2.3.1</version>
              <configuration>
                <target>1.6</target>
                <source>1.6</source>
              </configuration>
            </plugin>
          </plugins>
        </build>
        ...
        <dependencies>
          <dependency>
            <groupId>javax.persistence</groupId>
            <artifactId>persistence-api</artifactId>
            <version>2.0.0</version>
          </dependency>
          <dependency>
            <groupId>org.eclipse.persistence</groupId>
            <artifactId>eclipselink</artifactId>
            <version>2.0.0</version>
          </dependency>
          <dependency>
            <groupId>com.microsoft.sqlserver.jdbc</groupId>
            <artifactId>sqljdbc</artifactId>
            <version>4.0.0</version>
          </dependency>
        </dependencies>
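    Reading the trace, the failure happens inside EclipseLink's persistence-archive scanning: it obtains a classpath URL whose handler expects One-JAR's JarClassLoader, but the loader in play is the system AppClassLoader, hence the ClassCastException. That reading is my interpretation of the stack trace, not a confirmed diagnosis. A workaround many people use is to flatten everything into one ordinary jar with the maven-shade-plugin, so that no custom classloader or nested-jar URL is involved; a minimal sketch (the plugin version is an assumption to verify, the main class name is taken from the question):

        <!-- Sketch of a maven-shade-plugin alternative to onejar. -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-shade-plugin</artifactId>
          <version>1.3.3</version>
          <executions>
            <execution>
              <phase>package</phase>
              <goals><goal>shade</goal></goals>
              <configuration>
                <transformers>
                  <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                    <mainClass>com.test.onejartest.App</mainClass>
                  </transformer>
                </transformers>
              </configuration>
            </execution>
          </executions>
        </plugin>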

    Read the article

  • springTestContextBeforeTestMethod failed in Maven spring-test

    - by joejax
    I am trying to set up a project with spring-test using TestNG in Maven. The code is like:

        @ContextConfiguration(locations={"test-context.xml"})
        public class AppTest extends AbstractTestNGSpringContextTests {
            @Test
            public void testApp() {
                assert true;
            }
        }

    A test-context.xml simply defines a bean:

        <bean id="app" class="org.sonatype.mavenbook.simple.App"/>

    I got a "Failed to load ApplicationContext" error when running mvn test from the command line; it seems it cannot find the test-context.xml file. However, I can get it to run correctly inside Eclipse (with the TestNG plugin). So, test-context.xml is under src/test/resources/; how do I indicate this in the pom.xml so that the 'mvn test' command will work? Thanks.

    UPDATE: Thanks for the reply. The "cannot load context file" error was caused by my moving the file around to different locations, since I thought the classpath was the problem. Now I found the context file does seem to be loaded in the Maven output, but the test fails:

        Running TestSuite
        May 25, 2010 9:55:13 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
        INFO: Loading XML bean definitions from class path resource [test-context.xml]
        May 25, 2010 9:55:13 AM org.springframework.context.support.AbstractApplicationContext prepareRefresh
        INFO: Refreshing org.springframework.context.support.GenericApplicationContext@171bbc9: display name [org.springframework.context.support.GenericApplicationContext@171bbc9]; startup date [Tue May 25 09:55:13 PDT 2010]; root of context hierarchy
        May 25, 2010 9:55:13 AM org.springframework.context.support.AbstractApplicationContext obtainFreshBeanFactory
        INFO: Bean factory for application context [org.springframework.context.support.GenericApplicationContext@171bbc9]: org.springframework.beans.factory.support.DefaultListableBeanFactory@1df8b99
        May 25, 2010 9:55:13 AM org.springframework.beans.factory.support.DefaultListableBeanFactory preInstantiateSingletons
        INFO: Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@1df8b99: defining beans [app,org.springframework.context.annotation.internalCommonAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor]; root of factory hierarchy
        Tests run: 3, Failures: 2, Errors: 0, Skipped: 1, Time elapsed: 0.63 sec <<< FAILURE!

        Results :
        Failed tests:
          springTestContextBeforeTestMethod(org.sonatype.mavenbook.simple.AppTest)
          springTestContextAfterTestMethod(org.sonatype.mavenbook.simple.AppTest)
        Tests run: 3, Failures: 2, Errors: 0, Skipped: 1

    If I use spring-test version 3.0.2.RELEASE, the error becomes:

        org.springframework.test.context.testng.AbstractTestNGSpringContextTests.springTestContextPrepareTestInstance() is depending on nonexistent method null

    Here is the structure of the project:

        simple
        |-- pom.xml
        `-- src
            |-- main
            |   `-- java
            `-- test
                |-- java
                `-- resources
                    |-- test-context.xml
                    `-- testng.xml

    testng.xml:

        <suite name="Suite" parallel="false">
          <test name="Test">
            <classes>
              <class name="org.sonatype.mavenbook.simple.AppTest"/>
            </classes>
          </test>
        </suite>

    test-context.xml:

        <beans xmlns="http://www.springframework.org/schema/beans"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="http://www.springframework.org/schema/beans
                   http://www.springframework.org/schema/beans/spring-beans-2.0.xsd"
               default-lazy-init="true">
          <bean id="app" class="org.sonatype.mavenbook.simple.App"/>
        </beans>

    In the pom.xml, I add the testng, spring, and spring-test artifacts, and the plugins:

        <dependency>
          <groupId>org.testng</groupId>
          <artifactId>testng</artifactId>
          <version>5.1</version>
          <classifier>jdk15</classifier>
          <scope>test</scope>
        </dependency>
        <dependency>
          <groupId>org.springframework</groupId>
          <artifactId>spring</artifactId>
          <version>2.5.6</version>
        </dependency>
        <dependency>
          <groupId>org.springframework</groupId>
          <artifactId>spring-test</artifactId>
          <version>2.5.6</version>
          <scope>test</scope>
        </dependency>

        <build>
          <finalName>simple</finalName>
          <plugins>
            <plugin>
              <artifactId>maven-compiler-plugin</artifactId>
              <configuration>
                <source>1.6</source>
                <target>1.6</target>
              </configuration>
            </plugin>
            <plugin>
              <groupId>org.apache.maven.plugins</groupId>
              <artifactId>maven-surefire-plugin</artifactId>
              <configuration>
                <suiteXmlFiles>
                  <suiteXmlFile>src/test/resources/testng.xml</suiteXmlFile>
                </suiteXmlFiles>
              </configuration>
            </plugin>
          </plugins>

    Basically, I replaced the JUnit in 'A Simple Maven Project' with TestNG; hope it works.

    UPDATE: I think I got to the problem (still don't know why): whenever I extend AbstractTestNGSpringContextTests or AbstractTransactionalTestNGSpringContextTests, the test fails with this error:

        Failed tests:
          springTestContextBeforeTestMethod(org.sonatype.mavenbook.simple.AppTest)
          springTestContextAfterTestMethod(org.sonatype.mavenbook.simple.AppTest)

    So, eventually the error went away when I overrode the two methods. I don't think this is the right way; I didn't find much info in the spring-test docs. If you know the Spring test framework, please shed some light on this.
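    A possible explanation for why the overrides "fixed" it, offered as an inference from the pom rather than anything the thread confirms: TestNG 5.1 predates the configuration-method handling that AbstractTestNGSpringContextTests relies on, so its springTestContextBeforeTestMethod/AfterTestMethod hooks fail (and with spring-test 3.x the mismatch surfaces as "depending on nonexistent method null"). Rather than overriding the hooks, the commonly suggested change is the same TestNG upgrade sketched under the earlier, nearly identical question above:

        <!-- Hypothetical fix, as above: a TestNG version contemporary with
             the spring-test release (5.11 is an assumption to verify). -->
        <dependency>
          <groupId>org.testng</groupId>
          <artifactId>testng</artifactId>
          <version>5.11</version>
          <classifier>jdk15</classifier>
          <scope>test</scope>
        </dependency>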

    Read the article

  • Huge EAR deployment

    - by bozo
    Hello all, I'm trying to figure out how to deploy a huge (40-50 MB) EAR file to the server through a rather slow VPN connection. The EAR contains EJB and WAR projects created in Glassfish, and 90% of the file size comes from the external dependency libraries used. Has anyone come up with a strategy for elegant deployment to a production system from NetBeans, where the deployment (over the network) is done only for what is really needed (i.e. just one WAR, not the entire EAR, or just one lib, not the entire libraries subproject)? Related to the first point, how do I separate external dependency libs from the project in NetBeans, so that the project compiles on the development machine, but the created EAR/WAR/EJB does not contain all the dependency JARs, which are what make it huge? Perhaps we need to write a custom Ant script? Start using Maven? Thank you all for kind answers, Bozo
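    If the Maven route is taken, one relevant mechanism, offered as a sketch of a standard technique rather than a tested answer for this project, is the maven-ear-plugin's "skinny WARs" support: it moves shared dependency jars out of each WAR and into a single lib/ directory of the EAR, so that individual WARs stay small and can be redeployed cheaply (plugin version and whether your server's classloading tolerates this are assumptions to verify):

        <!-- Sketch: skinny WARs with maven-ear-plugin. -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-ear-plugin</artifactId>
          <configuration>
            <skinnyWars>true</skinnyWars>
            <defaultLibBundleDir>lib</defaultLibBundleDir>
          </configuration>
        </plugin>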

    Read the article

  • SQLDependency thread

    - by user171523
    I am in the process of implementing SqlDependency. I would like to know: when the dependency handler executes, will it be spun off on a different thread from the main process? What will happen when the event handler triggers? Do I need to worry about any multithreading issues?

        public void CreateSqlDependency()
        {
            try
            {
                using (SqlConnection connection = (SqlConnection)DBFactory.GetDBFactoryConnection(Constants.SQL_PROVIDER_NAME))
                {
                    SqlCommand command = (SqlCommand)DBFactory.GetCommand(Constants.SQL_PROVIDER_NAME);
                    command.CommandText = watchQuery;
                    command.CommandType = CommandType.Text;

                    SqlDependency dependency = new SqlDependency(command);

                    // Create the callback object
                    dependency.OnChange += new OnChangeEventHandler(this.QueueChangeNotificationHandler);
                    SqlDependency.Start(connectionString);

                    DataTable dataTable = DBFactory.ExecuteSPReDT(command);
                }
            }
            catch (SqlException sqlExp)
            {
                throw sqlExp;
            }
            catch (Exception ex)
            {
                throw ex;
            }
        }

        public void QueueChangeNotificationHandler(object caller, SqlNotificationEventArgs e)
        {
            if (e.Info == SqlNotificationInfo.Insert)
                Fire();
        }

    Read the article

  • Disposing of Objects with long living dependencies

    - by Ray Booysen
        public class ABC
        {
            public ABC(IEventableInstance dependency)
            {
                dependency.ANewEvent += MyEventHandler;
            }

            private void MyEventHandler(object sender, EventArgs e)
            {
                // Do Stuff
            }
        }

    Let us say that an instance of ABC is a long living object and that my dependency is an even longer living object. When an instance of ABC needs to be cleaned up, I have two options: I could have a Cleanup() method to unsubscribe from the ANewEvent event, or I could implement IDisposable and unwire the event in Dispose. Now I have no control over whether the consumer will call the Dispose method, or even the Cleanup method should I go that route. Should I implement a finaliser and unsubscribe there? It feels dirty, but I do not want hanging instances of ABC around. Thoughts?

    Read the article

  • Missing artifact error in Maven

    - by abhin4v
    I get a missing artifact error during the Maven build because one of the dependencies declares its parent artifact using a property for the version. The property itself is declared in the parent pom, and my project's build fails with this error:

        [ERROR] Failed to execute goal on project abc: Unable to get dependency information for xyz:pqr:jar:SNAPSHOT: Failed to process POM for xyz:pqr:jar:SNAPSHOT: Non-resolvable parent POM xyz:pqr-parent:${someversion} for xyz:pqr:${someversion}: Failed to resolve POM for xyz:pqr-parent:${someversion} due to Missing:
        ----------
        1) xyz:pqr-parent:pom:${someversion}
        ----------
        1 required artifact is missing.

        for artifact: xyz:pqr-parent:pom:${someversion}

    I have verified that the artifacts are present in the correct location in the repository. Is there a way to specify the value of the someversion property used in the dependency's pom? If not, how should the dependency's pom be changed to resolve the error?
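    To the second half of the question, and this reflects how Maven resolves parent POMs in general rather than anything specific to the xyz artifacts here: a consuming project cannot inject a property value into a dependency's pom; ${someversion} has to be resolvable from the dependency's own pom as deployed. The usual fix is therefore on the publisher's side: the deployed pom must carry a literal parent version, along the lines of:

        <!-- Sketch of the change in the dependency's own pom; the version
             value is a placeholder, the coordinates come from the error. -->
        <parent>
          <groupId>xyz</groupId>
          <artifactId>pqr-parent</artifactId>
          <version>1.0-SNAPSHOT</version>  <!-- literal, not ${someversion} -->
        </parent>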

    Read the article

  • Some doubts in WPF

    - by xyz
    I am entirely new to WPF. Though I am going through the tutorials and articles from the net, I am having doubts. Some of the questions that have struck me so far are:

    a) What are routed events, and what extra purpose do they serve?
    b) What are dependency properties and their benefits?
    c) How do dependency properties differ from attached properties?
    d) Why does bubbling happen first, followed by tunneling? And what is the purpose of this?

    e.g.

        <canvas>
            <button canvas.left=10/>
        </canvas>

    Is it a dependency property or an attached property? Thanks

    Read the article

  • maven provided scope

    - by kamaci
        <dependency>
          <groupId>javax.servlet</groupId>
          <artifactId>servlet-api</artifactId>
          <version>2.5</version>
          <scope>provided</scope>
        </dependency>

    I use that dependency in my project's pom.xml. My question is: I declared 2.5 as the version, but does it matter if I write a lower version? For example, what if my project uses version 3.0 and I write that 2.5 will be provided? (I mean, let's accept that 2.5 is fine and my project works well. If I don't change anything else and just change 2.5 to 2.0, does it cause an error?)

    Read the article

  • How to prevent Maven from resolving dependencies in the local repository

    - by Nils Schmidt
    Is there a way to tell Maven (when doing mvn package, mvn site, or ...) not to resolve the dependencies from the local repository? Background of this question: sometimes I get into problems when previously cached dependencies (e.g. SomeProject-0.7-ALPHA) are no longer available in the remote repository. In my local build everything still works fine, as the dependency has been cached before. As soon as I share my pom with others, they may get into trouble, as they don't have a cached version of that dependency and it can no longer be resolved from the remote repository. Any help will be appreciated. Thanks in advance!
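    One way to approximate this, offered as a suggestion from general Maven usage rather than from the thread itself, is to periodically clear the project's dependencies out of the local repository so the next build must fetch them remotely; the maven-dependency-plugin supports this directly and re-resolves by default:

        # Purge this project's dependencies from the local repository,
        # forcing re-resolution from the remote repositories, then build.
        mvn dependency:purge-local-repository package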

    Read the article

  • <jaxrs:client> not getting autowired

    - by himangshu
    I am trying to build a RESTful client using jaxrs:client as defined in http://svn.apache.org/repos/asf/cxf/trunk/systests/jaxrs/src/test/resources/jaxrs_soap_rest/WEB-INF/beans.xml. In my test class I am getting:

        org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'com.abc.service.ExportServiceTest': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private com.bankbazaar.service.ExportService com.abc.service.ExportServiceTest.exportClient; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No matching bean of type [com.abc.service.ExportService] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true), @org.springframework.beans.factory.annotation.Qualifier(value=exportClient)}

    This is my Spring config. However,

        exportClient = (ExportService) applicationContext.getBean("exportClient");

    works. Thanks, Himangshu

    Read the article

  • RMO on Windows 7 - The Specified module could not be found

    - by AGoodDisplayName
    My machine: Windows XP (x86), VS2008, .NET 3.5, SQL Server 2005, WinForms; the app works fine. Production machines: Windows 7 (x64), SQL Server 2005 Express; the app starts but throws an exception. Visual Studio is targeting x86 on the setup project and the RMO project. Visual Studio gives me a couple of warnings, but I can still build:

        Unable to find dependency 'MICROSOFT.SQLSERVER.MANAGEMENT.SQLPARSER' (Signature='89845DCD8080CC91' Version='10.0.0.0') of assembly 'Microsoft.SqlServer.Smo.dll'
        Unable to find dependency 'MICROSOFT.SQLSERVER.MANAGEMENT.SQLPARSER' (Signature='89845DCD8080CC91' Version='10.0.0.0') of assembly 'Microsoft.SqlServer.Management.SmoMetadataProvider.dll'

    This is a simple RMO (Replication Management Objects) app that initiates a pull subscription in SQL Server 2005 and displays status. It works fine on my machine, but fails on the production machine. I'm using a setup project to install the app on the production machine, but I think I'm missing a dependency somewhere and I can't figure it out. On the production machine the app starts fine, but when I try to synch the subscription I get:

        System.IO.FileNotFoundException: The specified module could not be found. (Exception from HRESULT: 0x8007007E)

    Read the article

  • Detect all dependencies of an application

    - by Ian
    Hi all, I am in the process of "detecting" (more like listing) all of the dependencies of our application. Currently, I am using depends.exe (Dependency Walker) to detect all of the file dependencies. I was actually able to get past all the error messages about missing files and dependencies. However, when launching the app, all I get is a crash without any messages at all. On a "working" configuration/system, I was able to launch this app successfully. Killing a certain service will reproduce the "crashing" behavior. This leads me to the conclusion that SOMETHING in this service is needed by the app, and that this service is a dependency. However, depends.exe is not able to "detect" this dependency. My question is: is there an application that can programmatically detect dependencies such as databases and services? Thanks!

    Read the article

  • Maven giving error Missing artifact

    - by newbie
    I'm adding Google App Engine, Spring, and Tiles2 to the same project, and for some reason Apache Maven gives this error:

        Missing artifact org.apache.commons:com.springsource.org.apache.commons.collections:jar:3.2.1:compile

    Here is my pom.xml, where I include the dependency:

        <dependency>
          <groupId>org.apache.commons</groupId>
          <artifactId>com.springsource.org.apache.commons.collections</artifactId>
          <version>3.2.1</version>
          <type>pom</type>
        </dependency>
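    One detail that stands out, though this is my observation rather than part of the question: the error asks for the artifact as jar:3.2.1 while the declaration forces <type>pom</type>, and the com.springsource.* repackaged artifacts were published as jars in the SpringSource Enterprise Bundle Repository rather than in Maven Central. A sketch of the commonly suggested declaration, with the EBR repository added (the URL is as documented at the time and worth verifying):

        <!-- Hypothetical fix: drop <type>pom</type> and add the SpringSource EBR. -->
        <dependency>
          <groupId>org.apache.commons</groupId>
          <artifactId>com.springsource.org.apache.commons.collections</artifactId>
          <version>3.2.1</version>
        </dependency>

        <repositories>
          <repository>
            <id>com.springsource.repository.bundles.external</id>
            <name>SpringSource Enterprise Bundle Repository - External Releases</name>
            <url>http://repository.springsource.com/maven/bundles/external</url>
          </repository>
        </repositories>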

    Read the article

  • 0.20.2 API hadoop version with java 5

    - by abdeslam
    I have started a Maven project trying to implement the MapReduce algorithm in Java 1.5.0_14. I have chosen the 0.20.2 API Hadoop version. In the pom.xml I'm thus using the following dependency:

        <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-core</artifactId>
          <version>0.20.2</version>
        </dependency>

    But when I'm using an import of the org.apache.hadoop classes, I get the following error:

        bad class file: ${HOME_DIR}\repository\org\apache\hadoop\hadoop-core\0.20.2\hadoop-core-0.20.2.jar(org/apache/hadoop/fs/Path.class)
        class file has wrong version 50.0, should be 49.0.

    Does someone know how I can solve this issue? Thanks.
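    For context (standard class-file version numbering, not something stated in the post): major version 50.0 is Java 6 bytecode and 49.0 is Java 5, so the message is saying that hadoop-core 0.20.2 was compiled for Java 6 while the build runs under JDK 1.5.0_14. The fix is to compile and run under a Java 6 JDK; with Maven that also means telling the compiler plugin to emit 1.6 bytecode, along these lines:

        <!-- Sketch: build with a JDK 6 toolchain and 1.6 source/target. -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <configuration>
            <source>1.6</source>
            <target>1.6</target>
          </configuration>
        </plugin>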

    Read the article

  • Eclipse compiler with APT

    - by westcam
    What is the correct way to call the Eclipse compiler with an APT processor from Java? I am using the following Maven dependency for the compiler:

        <dependency>
          <groupId>org.eclipse.jdt.core.compiler</groupId>
          <artifactId>ecj</artifactId>
          <version>3.5.1</version>
        </dependency>

    I want to test an APT processor with the Eclipse compiler in addition to javac.
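    One route I know of, sketched from the ecj batch API and worth verifying against the 3.5.1 javadoc: ecj ships a programmatic entry point, BatchCompiler, that accepts the same command line as the ecj jar, including the javac-style annotation-processing flags. The processor class, jar path, and source directory below are placeholders:

        import java.io.OutputStreamWriter;
        import java.io.PrintWriter;
        import org.eclipse.jdt.core.compiler.batch.BatchCompiler;

        public class EcjAptDemo {
            public static void main(String[] args) {
                PrintWriter out = new PrintWriter(new OutputStreamWriter(System.out));
                PrintWriter err = new PrintWriter(new OutputStreamWriter(System.err));
                // Compile 'src' with a hypothetical processor on the classpath;
                // -processor mirrors javac's annotation-processing option.
                boolean ok = BatchCompiler.compile(
                        "-1.6 -classpath lib/processor.jar"
                        + " -processor com.example.MyProcessor src",
                        out, err, null /* no CompilationProgress */);
                System.out.println("compiled without errors: " + ok);
            }
        }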

    Read the article

  • Why would someone use WHERE 1=1 AND <conditions> in a SQL clause?

    - by Bogdan Maxim
    Why would someone use WHERE 1=1 AND <conditions> in a SQL clause (either SQL obtained through concatenated strings, or a view definition)? I've seen somewhere that this would be used to protect against SQL injection, but that seems very weird. If there is injection, WHERE 1=1 AND injected OR 1=1 would have the same result as injected OR 1=1. Later edit: What about the usage in a view definition?
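    The usual (non-security) motivation, for what it's worth: 1=1 is a neutral first predicate, so code that builds a WHERE clause dynamically can append every real condition with a uniform "AND ..." and never has to track whether it is emitting the first one. A minimal Java sketch (table and column names invented for illustration; values still go through PreparedStatement placeholders, since 1=1 itself offers no injection protection):

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.SQLException;
        import java.util.ArrayList;
        import java.util.List;

        public class DynamicWhere {
            // Builds a query with optional filters; every condition is appended
            // as "AND ..." because "WHERE 1=1" is already a valid first predicate.
            public static PreparedStatement find(Connection con, String name, Integer minAge)
                    throws SQLException {
                StringBuilder sql = new StringBuilder("SELECT * FROM customers WHERE 1=1");
                List<Object> params = new ArrayList<>();
                if (name != null) {
                    sql.append(" AND name = ?");   // no special case for the first filter
                    params.add(name);
                }
                if (minAge != null) {
                    sql.append(" AND age >= ?");
                    params.add(minAge);
                }
                PreparedStatement ps = con.prepareStatement(sql.toString());
                for (int i = 0; i < params.size(); i++) {
                    ps.setObject(i + 1, params.get(i)); // values bound, never concatenated
                }
                return ps;
            }
        }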

    Read the article

  • google salve - maven

    - by chris-gr
    Hi all, I tried to install Google Salve (http://code.google.com/p/salve/) by adding the following statements to the project's pom file. However, when running mvn dependency:resolve it states: unable to find resource "salve:salve:jar:2.0" in repository salve. What's wrong?

        <dependency>
          <groupId>salve</groupId>
          <artifactId>salve</artifactId>
          <version>2.0</version>
        </dependency>

        <repository>
          <id>salve</id>
          <name>Google Maven Repository</name>
          <url>http://salve.googlecode.com/svn/maven2/</url>
        </repository>
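    One thing to double-check, as a general Maven point rather than a diagnosis of this particular repository: a <repository> element is only honored when it is nested inside the <repositories> section of the pom (or inside a profile or settings.xml); placed anywhere else it has no effect and resolution falls back to the default repositories. A sketch:

        <!-- The repository element must live inside <repositories>. -->
        <repositories>
          <repository>
            <id>salve</id>
            <name>Google Maven Repository</name>
            <url>http://salve.googlecode.com/svn/maven2/</url>
          </repository>
        </repositories>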

    Read the article

  • Running lmgrd on ubuntu 14.04 LTS

    - by SumanBhatR
    I have installed Xilinx 14.7 on an Ubuntu 14.04 LTS machine (i386 - 64bit). But I am unable to run lmgrd (for starting the license server). When I googled this problem, I found that the lsb-core package needs to be installed. But the package has many dependencies, and I want to know how to install the lsb-core package with all the necessary dependencies. Thanks for the help.

    On running sudo apt-get install lsb-core I got the following output:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Package lsb-core is not available, but is referred to by another package.
        This may mean that the package is missing, has been obsoleted, or
        is only available from another source
        E: Package 'lsb-core' has no installation candidate

    So I downloaded the lsb-core package from the http://packages.ubuntu.com/trusty/misc/lsb-core site and used "sudo dpkg -i ./lsb-core_4.1+Debian11ubuntu6_i386.deb" to install it. By doing so, I got the following output:

        Selecting previously unselected package lsb-core.
        (Reading database ... 163205 files and directories currently installed.)
        Preparing to unpack .../lsb-core_4.1+Debian11ubuntu6_i386.deb ...
        Unpacking lsb-core (4.1+Debian11ubuntu6) ...
        dpkg: dependency problems prevent configuration of lsb-core:
         lsb-core depends on libc6 (>= 2.3.5).
         lsb-core depends on libz1.
         lsb-core depends on libncurses5.
         lsb-core depends on libpam0g.
         lsb-core depends on lsb-invalid-mta (= 4.1+Debian11ubuntu6) | mail-transport-agent.
         lsb-core depends on at.
         lsb-core depends on binutils.
         lsb-core depends on cron | cron-daemon.
         lsb-core depends on libc6-dev | libc-dev.
         lsb-core depends on locales.
         lsb-core depends on m4.
         lsb-core depends on mailx | mailutils.
         lsb-core depends on ncurses-term.
         lsb-core depends on pax.
         lsb-core depends on psmisc.
         lsb-core depends on alien (>= 8.36).
         lsb-core depends on python3.
         lsb-core depends on lsb-security (= 4.1+Debian11ubuntu6).
         lsb-core depends on time.
        dpkg: error processing package lsb-core (--install):
         dependency problems - leaving unconfigured
        Processing triggers for man-db (2.6.7.1-1) ...
        Errors were encountered while processing:
         lsb-core

    So I want to know how to install the lsb-core package with all the necessary dependencies in one go. Thanks for the help.

    Read the article
