Search Results

Search found 5188 results on 208 pages for 'cross compilation'.

Page 118/208 | < Previous Page | 114 115 116 117 118 119 120 121 122 123 124 125  | Next Page >

  • How to merge two different Makefiles?

    - by martijnn2008
    I have done some reading on "Merging Makefiles"; one suggestion was to leave the two Makefiles separate in different folders [1]. To me this looks counterintuitive, because I have the following situation: I have 3 source files (main.cpp, flexibility.cpp, constraints.cpp), and one of them (flexibility.cpp) makes use of the COIN-OR Linear Programming library (Clp). Installing this library on my computer generates sample Makefiles; I have adjusted one of them and it currently builds a good working binary. # Copyright (C) 2006 International Business Machines and others. # All Rights Reserved. # This file is distributed under the Eclipse Public License. # $Id: Makefile.in 726 2006-04-17 04:16:00Z andreasw $ ########################################################################## # You can modify this example makefile to fit for your own program. # # Usually, you only need to change the five CHANGEME entries below. # ########################################################################## # To compile other examples, either changed the following line, or # add the argument DRIVER=problem_name to make DRIVER = main # CHANGEME: This should be the name of your executable EXE = clp # CHANGEME: Here is the name of all object files corresponding to the source # code that you wrote in order to define the problem statement OBJS = $(DRIVER).o constraints.o flexibility.o # CHANGEME: Additional libraries ADDLIBS = # CHANGEME: Additional flags for compilation (e.g., include flags) ADDINCFLAGS = # CHANGEME: Directory to the sources for the (example) problem definition # files SRCDIR = . ########################################################################## # Usually, you don't have to change anything below. Note that if you # # change certain compiler options, you might have to recompile the # # COIN package. 
# ########################################################################## COIN_HAS_PKGCONFIG = TRUE COIN_CXX_IS_CL = #TRUE COIN_HAS_SAMPLE = TRUE COIN_HAS_NETLIB = #TRUE # C++ Compiler command CXX = g++ # C++ Compiler options CXXFLAGS = -O3 -pipe -DNDEBUG -pedantic-errors -Wparentheses -Wreturn-type -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings -Wconversion -Wno-unknown-pragmas -Wno-long-long -DCLP_BUILD # additional C++ Compiler options for linking CXXLINKFLAGS = -Wl,--rpath -Wl,/home/martijn/Downloads/COIN/coin-Clp/lib # C Compiler command CC = gcc # C Compiler options CFLAGS = -O3 -pipe -DNDEBUG -pedantic-errors -Wimplicit -Wparentheses -Wsequence-point -Wreturn-type -Wcast-qual -Wall -Wno-unknown-pragmas -Wno-long-long -DCLP_BUILD # Sample data directory ifeq ($(COIN_HAS_SAMPLE), TRUE) ifeq ($(COIN_HAS_PKGCONFIG), TRUE) CXXFLAGS += -DSAMPLEDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatasample`\" CFLAGS += -DSAMPLEDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatasample`\" else CXXFLAGS += -DSAMPLEDIR=\"\" CFLAGS += -DSAMPLEDIR=\"\" endif endif # Netlib data directory ifeq ($(COIN_HAS_NETLIB), TRUE) ifeq ($(COIN_HAS_PKGCONFIG), TRUE) CXXFLAGS += -DNETLIBDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatanetlib`\" CFLAGS += -DNETLIBDIR=\"`PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --variable=datadir coindatanetlib`\" else CXXFLAGS += -DNETLIBDIR=\"\" CFLAGS += -DNETLIBDIR=\"\" endif endif # Include directories (we use the CYGPATH_W variables to allow compilation with Windows compilers) ifeq ($(COIN_HAS_PKGCONFIG), TRUE) INCL = `PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --cflags clp` else INCL = endif INCL += $(ADDINCFLAGS) # Linker flags ifeq ($(COIN_HAS_PKGCONFIG), TRUE) LIBS = `PKG_CONFIG_PATH=/home/martijn/Downloads/COIN/coin-Clp/lib64/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/lib/pkgconfig:/home/martijn/Downloads/COIN/coin-Clp/share/pkgconfig: pkg-config --libs clp` else ifeq ($(COIN_CXX_IS_CL), TRUE) LIBS = -link -libpath:`$(CYGPATH_W) /home/martijn/Downloads/COIN/coin-Clp/lib` libClp.lib else LIBS = -L/home/martijn/Downloads/COIN/coin-Clp/lib -lClp endif endif # The following is necessary under cygwin, if native compilers are used CYGPATH_W = echo # Here we list all possible generated objects or executables to delete them CLEANFILES = clp \ main.o \ flexibility.o \ constraints.o \ all: $(EXE) .SUFFIXES: .cpp .c .o .obj $(EXE): $(OBJS) bla=;\ for file in $(OBJS); do bla="$$bla `$(CYGPATH_W) $$file`"; done; \ $(CXX) $(CXXLINKFLAGS) $(CXXFLAGS) -o $@ $$bla $(LIBS) $(ADDLIBS) clean: rm -rf $(CLEANFILES) .cpp.o: $(CXX) $(CXXFLAGS) $(INCL) -c -o $@ `test -f '$<' || echo '$(SRCDIR)/'`$< .cpp.obj: $(CXX) $(CXXFLAGS) $(INCL) -c -o $@ `if test -f '$<'; then $(CYGPATH_W) '$<'; else 
    $(CYGPATH_W) '$(SRCDIR)/$<'; fi` .c.o: $(CC) $(CFLAGS) $(INCL) -c -o $@ `test -f '$<' || echo '$(SRCDIR)/'`$< .c.obj: $(CC) $(CFLAGS) $(INCL) -c -o $@ `if test -f '$<'; then $(CYGPATH_W) '$<'; else $(CYGPATH_W) '$(SRCDIR)/$<'; fi` The other Makefile compiles a lot of code and makes use of bison and flex. It was also written by someone else. I am able to alter this Makefile when I want to add some code, and it also produces a binary. CFLAGS=-Wall LDLIBS=-LC:/GnuWin32/lib -lfl -lm LSOURCES=lex.l YSOURCES=grammar.ypp CSOURCES=debug.cpp esta_plus.cpp heap.cpp main.cpp stjn.cpp timing.cpp tmsp.cpp token.cpp chaining.cpp flexibility.cpp exceptions.cpp HSOURCES=$(CSOURCES:.cpp=.h) includes.h OBJECTS=$(LSOURCES:.l=.o) $(YSOURCES:.ypp=.tab.o) $(CSOURCES:.cpp=.o) all: solver solver: CFLAGS+=-g -O0 -DDEBUG solver: $(OBJECTS) main.o debug.o g++ $(CFLAGS) -o $@ $^ $(LDLIBS) solver.release: CFLAGS+=-O5 solver.release: $(OBJECTS) main.o g++ $(CFLAGS) -o $@ $^ $(LDLIBS) %.o: %.cpp g++ -c $(CFLAGS) -o $@ $< lex.cpp: lex.l grammar.tab.cpp grammar.tab.hpp flex -o$@ $< %.tab.cpp %.tab.hpp: %.ypp bison --verbose -d $< ifneq ($(LSOURCES),) $(LSOURCES:.l=.cpp): $(YSOURCES:.y=.tab.h) endif -include $(OBJECTS:.o=.d) clean: rm -f $(OBJECTS) $(OBJECTS:.o=.d) $(YSOURCES:.ypp=.tab.cpp) $(YSOURCES:.ypp=.tab.hpp) $(YSOURCES:.ypp=.output) $(LSOURCES:.l=.cpp) solver solver.release 2>/dev/null .PHONY: all clean debug release Both of these Makefiles are hard for me to understand; I don't know exactly what they do. What I want is to merge the two of them so I get only one binary; the code compiled by the second Makefile should be the result. I want to add flexibility.cpp and constraints.cpp to the second Makefile, but when I do, I get the following problem: flexibility.h:4:26: fatal error: ClpSimplex.hpp: No such file or directory #include "ClpSimplex.hpp" So the compiler can't find the Clp library. I also tried to copy-paste more code from the first Makefile into the second, but it still gives me the same error. Q: Can you please help me with merging the two Makefiles, or point out a more elegant way? Q: In this case, is it indeed better to merge the two Makefiles? I also tried to use CMake, but I gave up on it quickly, because I don't know much about flex and bison.
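
    One possible way to attack the merge, sketched below, is to keep the second Makefile as the single build and only pull the Clp compile and link flags into it via pkg-config, the same way the COIN-OR sample does. The install path is taken from the first Makefile; everything else here is an assumption about where the lines would go.

        # Hypothetical additions to the second Makefile (put them after the existing
        # CFLAGS/LDLIBS/CSOURCES lines); CLP_PREFIX may differ on other machines.
        CLP_PREFIX  := /home/martijn/Downloads/COIN/coin-Clp
        CLP_PKGPATH := $(CLP_PREFIX)/lib64/pkgconfig:$(CLP_PREFIX)/lib/pkgconfig:$(CLP_PREFIX)/share/pkgconfig
        CLP_CFLAGS  := $(shell PKG_CONFIG_PATH=$(CLP_PKGPATH) pkg-config --cflags clp)
        CLP_LIBS    := $(shell PKG_CONFIG_PATH=$(CLP_PKGPATH) pkg-config --libs clp)

        CFLAGS  += $(CLP_CFLAGS)
        LDLIBS  += $(CLP_LIBS) -Wl,-rpath,$(CLP_PREFIX)/lib

        # constraints.cpp simply joins the existing source list
        # (flexibility.cpp is already listed in CSOURCES)
        CSOURCES += constraints.cpp

    With the extra include flags in CFLAGS, the #include "ClpSimplex.hpp" line should resolve; whether merging is better than keeping two builds mostly depends on whether the two programs are really meant to become one binary.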

    Read the article

  • Remote C++ Development using SSH only inside Eclipse Environment

    - by EFreak
    How do you integrate the Remote Systems Explorer and CDT plugins inside Eclipse? What I mean is that you can use the Remote Systems Explorer (RSE) plugin to work on C++ code on a remote Linux box inside Eclipse, but when you try to compile, you basically run a shell command through SSH. The CDT plugin is unable to locate the remote system and, of course, the remote compiler. Is there a way to integrate both plugins so that we can use the parsing/suggestion features of CDT for the remote system as well, along with features like remote compilation and remote debugging over SSH only? If this is not possible, what is the closest open source alternative for this problem?

    Read the article

  • Installing Ruby 1.9.3 on OS X Mavericks

    - by user1648962
    I am trying to install Ruby 1.9.3 on my OS X 10.9 system and I keep getting the following error: Error running 'requirements_osx_port_update_system ruby-1.9.3-p448', please read /Users/ramesh/.rvm/log/1383430694_ruby-1.9.3-p448/update_system.log Requirements installation failed with status: 1. I am using the following command to do the installation: rvm install 1.9.3 The complete log is given below: Ramesh:Downloads ramesh$ rvm install 1.9.3 Searching for binary rubies, this might take some time. No binary rubies available for: osx/10.9/x86_64/ruby-1.9.3-p448. Continuing with compilation. Please read 'rvm help mount' to get more information on binary rubies. Checking requirements for osx. Installing requirements for osx. Updating system... Error running 'requirements_osx_port_update_system ruby-1.9.3-p448', please read /Users/ramesh/.rvm/log/1383430694_ruby-1.9.3-p448/update_system.log Requirements installation failed with status: 1. Ramesh:Downloads ramesh$ clear

    Read the article

  • Ant target for compile-time code instrumentation with Spring aspects

    - by alecswan
    I have developed a web application using NetBeans 6.7 and Ant. The webapp works, but I would like to refactor the code to use the @Configurable Spring annotation for cleaner dependency injection. I was able to get load-time weaving (LTW) of Spring aspects to work intermittently (see http://forum.springsource.org/showthread.php?t=86904). At this point I would like to use compile-time weaving with my tool set. Could anybody provide an Ant target that I can use to weave Spring aspects at compile time? Extra credit will be given to anybody who explains how to configure NetBeans to execute the new Ant target right after code compilation. Thanks.
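
    One way to do this, sketched below with AspectJ's Ant task, is to weave the already-compiled classes in place right after the normal compile step. The jar locations and target names here are assumptions, not taken from the project:

        <!-- Hypothetical target; assumes aspectjtools.jar, spring-aspects.jar and the
             project's other jars live under lib/ and that "compile" is the normal
             javac target. -->
        <taskdef resource="org/aspectj/tools/ant/taskdefs/aspectjTaskdefs.properties"
                 classpath="lib/aspectjtools.jar"/>

        <target name="weave-aspects" depends="compile">
            <iajc destdir="build/classes" source="1.5" showWeaveInfo="true">
                <!-- weave the classes javac just produced -->
                <inpath>
                    <pathelement location="build/classes"/>
                </inpath>
                <!-- the Spring aspects to weave in -->
                <aspectpath>
                    <pathelement location="lib/spring-aspects.jar"/>
                </aspectpath>
                <classpath>
                    <fileset dir="lib" includes="*.jar"/>
                </classpath>
            </iajc>
        </target>

    In a NetBeans Ant-based project the generated build.xml usually exposes a -post-compile hook, which is the natural place to call such a target so it runs right after compilation.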

    Read the article

  • What are the real-world benefits of declarative-UI languages such as XAML and QML?

    - by Stu Mackellar
    I'm currently evaluating QtQuick (the Qt User Interface Creation Kit), which will be released as part of Qt 4.7. QML is the JavaScript-based declarative language behind QtQuick. It seems to be a very powerful concept, but I'm wondering if anybody who has made extensive use of other, more mature declarative-UI languages like XAML in WPF or Silverlight can give any insight into the real-world benefits that can be gained from this style of programming. Various advantages are often cited: speed of development; enforced separation between presentation and logic; better integration between coders and designers; UI changes that don't require re-compilation. Also, are there any downsides? A few potential areas of concern spring to mind: execution speed, memory usage, and added complexity. Are there any other considerations that should be taken into account?

    Read the article

  • ExecutionEngineException thrown when loading native dll

    - by Axarydax
    I have a 32-bit .NET application that uses a native 32-bit DLL via DllImport(). The native DLL is our internal file analysis library, and I need to use it, as porting it to C# would be a problem if people update it (other software uses it). The problem is that when I try to execute any method in the native DLL, a System.ExecutionEngineException is thrown. In fact, I've reduced the managed application to a simple tester that just calls a native method, but it still fails. I am on 64-bit Windows 7, but that should not matter as I'm compiling everything as 32-bit binaries. What is also interesting: when I look at the native DLL in Dependency Walker, it shows that it can't find msvcr90.dll - but when I open any other of our native DLLs in Dependency Walker, it finds their referenced msvcr90.dll just fine. Can there be something wrong in the compilation of the native DLL that messes up its DLL references?
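
    Before anything else, it is worth making the P/Invoke declaration explicit about the calling convention and signature; a mismatch there corrupts the stack and commonly surfaces as ExecutionEngineException. The sketch below is illustrative only - the DLL name, export name and convention are assumptions, not taken from the question:

        // Hypothetical declaration; the real export name, parameters and calling
        // convention (Cdecl vs. StdCall) must match the native header exactly.
        using System.Runtime.InteropServices;

        static class NativeAnalysis
        {
            [DllImport("FileAnalysis.dll",
                       CallingConvention = CallingConvention.Cdecl,
                       CharSet = CharSet.Ansi)]
            public static extern int AnalyzeFile(string path);
        }

    The missing msvcr90.dll shown by Dependency Walker is usually a side-by-side (WinSxS) resolution issue rather than a build defect; installing the matching Visual C++ 2008 runtime or embedding the correct manifest in the native DLL is the usual cure.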

    Read the article

  • How do I get source file information with dumpbin /symbols when compiling with VS 2005?

    - by Thomas Dartsch
    I have a tool which uses the output of dumpbin /symbols to do some dependency analysis on our C/C++ libraries. When we compiled the libs with VS 6.0, the dumpbin COFF SYMBOL TABLE contained entries like 000 00000008 DEBUG notype Filename | .file x:\mydir\mysource.c allowing me to get the relationship between sources and defined/used symbols, which is essential for my tool. When we compile with VS 2005, these entries are missing. When I look at the libs with a hex editor, it seems that there is no filename information included in the binary files at all, so it seems not to be a dumpbin problem but to be compilation related. So I'm looking for a way to get the Filename entries back into my libraries when compiling with VS 2005.

    Read the article

  • Adobe Flash Builder (Flex 4): addChild() is not available in this class

    - by ufk
    Hi. I want to be able to load an SWF into a Flex 4 application in order to use its classes. var ldr:Loader=new Loader(); ldr.load(new URLRequest("file://path/to/fileswf")); ldr.contentLoaderInfo.addEventListener(Event.INIT, loaded); function loaded(evt:Event):void { addChild(ldr); } I receive the error: Error: addChild() is not available in this class. Instead, use addElement() or modify the skin, if you have one. at spark.components.supportClasses::SkinnableComponent/addChild()[E:\dev\gumbo_beta2\frameworks\projects\spark\src\spark\components\supportClasses\SkinnableComponent.as:966] at main/private:init/loaded()[C:\Documents and Settings\ufk\Adobe Flash Builder Beta 2\xpogames-toolkit-test\src\main.mxml:22] If I change addChild() to addElement() I receive the following compilation error: 1067: Implicit coercion of a value of type flash.display:Loader to an unrelated type mx.core:IVisualElement. main.mxml path/dir line 22 Flex Problem Any ideas how to resolve the issue?
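
    The usual workaround, sketched here under the assumption that the loading code lives in the main Spark application, is to keep addElement() but hand it something that implements IVisualElement by wrapping the Loader in a UIComponent:

        // mx.core.UIComponent implements IVisualElement, so it can be passed to
        // addElement(); the Loader is then added to the wrapper with addChild().
        import mx.core.UIComponent;

        private function loaded(evt:Event):void {
            var wrapper:UIComponent = new UIComponent();
            wrapper.addChild(ldr);     // a plain DisplayObject child is fine here
            addElement(wrapper);       // wrapper satisfies IVisualElement
        }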

    Read the article

  • Compile 32-bit Mercurial on x86_64

    - by krashalot
    I'm using the academic version of EPD (Enthought Python Distribution) which is 32bit. My computer is Linux x86_64. platform.architecture() returns ('32bit','ELF') I want to install Mercurial. The instructions in README didn't work at first, because make gave this error: "LONG_BIT definition appears wrong for platform (bad gcc/glibc config?)." I commented out that line in pyport.h and then it compiled fine. Now, after successful compilation I get this error when running it: ImportError: /scratch/epd/lib/python2.6/site-packages/mercurial/osutil.so: wrong ELF class: ELFCLASS64 It appears that I compiled a 64bit version of hg, and it won't run with my 32bit python. I don't see any arch flags in the mercurial makefile. How can I force it to compile in 32bit mode?

    Read the article

  • How can I hide a database column in the entity model?

    - by Nick Butler
    Hi. I'm using the Entity Framework 4 and have a question: I have a password column in my database that I want to manage using custom SQL. So I don't want the model to know anything about it. I've tried deleting the property in the Mapping Details window, but then I got a compilation error: Error 3023: Problem in mapping fragments starting at line 1660:Column User.Password in table User must be mapped: It has no default value and is not nullable. So, I made the column nullable in the database and updated the model. Now I get this error: Error 3004: Problem in mapping fragments starting at line 1660:No mapping specified for properties User.Password, User.Salt in Set Users. An Entity with Key (PK) will not round-trip when: Entity is type [UserDirectoryModel.User] Any ideas please? Thanks, Nick

    Read the article

  • Replace LinkedList element value through LinkedList.Enumerator

    - by Yan Cheng CHEOK
    I realize there is no way for me to replace a value through LinkedList.Enumerator. Instead, I am trying to port the Java code below to C#: // Java ListIterator<Double> itr1 = linkedList1.listIterator(); ListIterator<Double> itr2 = linkedList2.listIterator(); while(itr1.hasNext() && itr2.hasNext()){ Double d = itr1.next() + itr2.next(); itr1.set(d); } // C# LinkedList<Double>.Enumerator itr1 = linkedList1.GetEnumerator(); LinkedList<Double>.Enumerator itr2 = linkedList2.GetEnumerator(); while(itr1.MoveNext() && itr2.MoveNext()){ Double d = itr1.Current + itr2.Current; // Oops. Compilation error! itr1.Current = d; } Any other technique I can use?
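
    A minimal C# sketch of one alternative, reusing linkedList1 and linkedList2 from the snippet above: LinkedList<T> exposes its nodes, and LinkedListNode<T>.Value is writable, so walking the node chain gives the same in-place update that ListIterator.set() performs in Java.

        LinkedListNode<double> node1 = linkedList1.First;
        LinkedListNode<double> node2 = linkedList2.First;
        while (node1 != null && node2 != null)
        {
            node1.Value = node1.Value + node2.Value;  // replace the stored value in place
            node1 = node1.Next;
            node2 = node2.Next;
        }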

    Read the article

  • Visual Studio "Any CPU" target

    - by galets
    I have some confusion related to the .NET platform build options in VS 2008. Does anyone have a clear understanding of what the "Any CPU" compilation target is and what sort of files it generates? I examined the output executables of an "Any CPU" build and found that they are (who would not see that coming!) x86 executables. So, is there any difference between targeting an executable to x86 vs. "Any CPU"? Another thing that I noticed is that managed C++ projects do not have this platform as an option. I'm wondering why that is. Does that mean that my suspicion about "Any CPU" executables being plain 32-bit ones is right?
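
    One way to see the difference is the minimal sketch below: an AnyCPU assembly carries the same IL everywhere (its PE header simply looks 32-bit to inspection tools), but on a 64-bit OS the CLR loads it into a 64-bit process, whereas an x86 build of the same source always gets a 32-bit process.

        using System;

        class BitnessProbe
        {
            static void Main()
            {
                // Prints 8 when an AnyCPU build runs on 64-bit Windows,
                // 4 for the identical source compiled with the x86 target.
                Console.WriteLine("Pointer size: {0} bytes", IntPtr.Size);
            }
        }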

    Read the article

  • Calling an overloaded method from a generic method

    - by asela38
    How to create a generic method which can call overloaded methods? I tried but it gives a compilation error. Test.java:19: incompatible types found : java.lang.Object required: T T newt = getCloneOf(t); ^ import java.util.*; public class Test { private Object getCloneOf(Object s) { return new Object(); } private String getCloneOf(String s) { return new String(s); } private <T> Set<T> getCloneOf(Set<T> set){ Set<T> newSet = null; if( null != set) { newSet = new HashSet<T>(); for (T t : set) { T newt = getCloneOf(t); newSet.add(newt); } } } }
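
    The error is expected: Java picks an overload at compile time from the static type of the argument, and inside the generic method T is only known as Object, so getCloneOf(t) binds to getCloneOf(Object), whose Object result cannot be assigned to T. A minimal sketch of one workaround (the method name is illustrative) is to dispatch at run time instead; it fits into the same class, which already imports java.util.*:

        // Runtime type checks replace the compile-time overload choice.
        private Object cloneOf(Object o) {
            if (o instanceof String) {
                return new String((String) o);          // copy of a String
            }
            if (o instanceof Set<?>) {
                Set<Object> copy = new HashSet<Object>();
                for (Object element : (Set<?>) o) {
                    copy.add(cloneOf(element));         // recurse per element
                }
                return copy;
            }
            return new Object();                        // fallback, as in the question
        }

    The generic getCloneOf(Set<T>) can then call cloneOf(t) and cast the result back to T (an unchecked cast).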

    Read the article

  • yuicompressor error, not sure what is wrong?

    - by mrblah
    Hi, Very confused here, trying out the yuicompressor on a simple javascript file. My js file looks like: function splitText(text) { return text.split('-')[1]; } The error is: [INFO] Using charset Cp1252 [Error] 1:20:illegal character [Error] 1:20:syntax error [Error] 1:40:illegal character [Error] 1:49:missing ; before statement [Error] 1:50:illegal character .. .. [Error] 7:3:missing | in compound statement [error] 1:0:compilation produced 38 syntax errors ... Can someone please explain to me what is wrong?

    Read the article

  • Delphi's OTA: is there a way to get active configuration while building (D2010)?

    - by Alexander
    I can ask Delphi to build all configurations at once - by clicking on "Build configurations" and invoking the "Make" command: This will build all configurations, one after another. The problem is that we have an IDE expert which must react to compilation events. We register IOTAIDENotifier80 to hook events. There are BeforeBuild and AfterBuild events - we're interested in those. IOTAProject is passed to each event. The problem is: the active configuration is never changed. I.e. if you have the "Debug" configuration selected (marked in bold), all calls to the BeforeBuild/AfterBuild events will return the Debug configuration profile (even though the IDE compiles the different profiles one after another). I mean the properties of IOTAProject here. I also tried to use IOTAProjectOptionsConfigurations, but its ActiveConfiguration property always returns the same "bolded" profile, regardless of the currently compiled one. The question is: is there a way to get the "real" current profile?

    Read the article

  • Supporting multiple versions without separate builds in JavaME

    - by Casebash
    I want to be able to support multiple versions of Java ME without having to have multiple builds. I already know how to detect the profile/configuration/supported JSRs. My problem is that knowing whether a JSR is supported at run time doesn't by itself allow me to use all the features: if I call a function added in a later version anywhere in the code - even in a location that will never be run - this causes an error due to static typing. Is there any way around this?
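
    The usual pattern, sketched below with illustrative class and property names, is to confine every reference to the optional API to a separate class, and to load that class with Class.forName() only after the run-time check has passed; devices without the JSR then never verify the class that mentions the newer API.

        // In LocationFeature.java: the interface seen by the portable code.
        public interface LocationFeature {
            String describePosition();
        }

        // In FeatureLoader.java: never mentions the optional API directly.
        public final class FeatureLoader {
            private FeatureLoader() {}

            public static LocationFeature load() {
                // JSR-179 advertises itself through this system property.
                if (System.getProperty("microedition.location.version") == null) {
                    return null;
                }
                try {
                    // Jsr179Location is a hypothetical class; it is the only
                    // place that imports javax.microedition.location.
                    return (LocationFeature)
                            Class.forName("com.example.Jsr179Location").newInstance();
                } catch (Throwable t) {
                    return null;   // missing or unverifiable class: fall back
                }
            }
        }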

    Read the article

  • C++ template partial specialization error

    - by JP19
    Hi, The following code is giving me a compilation error: class Q64 is not a valid type for a template constant parameter template<int GRIDD, class T> INLINE T grid_residue(T amount) { T rem = amount%(GRIDD); if (rem > GRIDD/2) rem -= GRIDD; return rem; } template<int GRIDD, Q64> INLINE Q64 grid_residue(Q64 amount) { return Q64(grid_residue<GRIDD, int64_t>(to_int(amount))); } What's wrong? I am trying to specialize grid_residue for class Q64. Thanks
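
    Two things are going on here: function templates cannot be partially specialized at all, and template<int GRIDD, Q64> declares an unnamed non-type parameter of type Q64, which is exactly what the compiler rejects. A plain overload that keeps only the GRIDD parameter (a sketch using Q64, INLINE and to_int from the code above) does what the specialization was meant to do:

        // Overload for Q64; partial ordering prefers it over the generic version
        // whenever the argument is a Q64.
        template <int GRIDD>
        INLINE Q64 grid_residue(Q64 amount)
        {
            return Q64(grid_residue<GRIDD, int64_t>(to_int(amount)));
        }

        // Call sites stay unchanged, e.g.:  Q64 r = grid_residue<8>(value);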

    Read the article

  • Table and Column names causing problems

    - by craig
    I have an issue when the T4 LINQ templates generate the classes for my MySQL db using SubSonic 3. It looks like one of our table names, "operator", is causing problems in the generated Context.cs class. In the following line of code in Context.cs, Visual Studio sees operator as a reserved C# keyword and generates a compilation error of "Type expected": public Query<operator> operators { get; set; } Is there any way I can work around this without having to rename my database table and column names? For example, hard-coding something in Settings.ttinclude to use or map different names for specific db tables and columns?

    Read the article

  • C# ASP.NET: The type or namespace name 'Secure' does not exist in the namespace 'source_extranet'

    - by Louis Russell
    Afternoon all, I really need your help as this is a nightmare! I was earlier having a problem with referencing a 3rd party DLL (Here) but have overcome that problem and am now having a problem referencing my own classes! Everything seems fine at build time with no errors at all, but when I go to run the application it comes up with the following compilation error: Compiler Error Message: CS0234: The type or namespace name 'Secure' does not exist in the namespace 'source_extranet' (are you missing an assembly reference?) The line that the error points to is this line in the class: source_extranet.Secure.BackendCustomData newdata = new source_extranet.Secure.BackendCustomData(); This line of code points to a class in the same folder as the calling class. I have scoured Google and haven't found anything that points me to an answer to my problem. Any help would be AMAZING!

    Read the article

  • Running Perl Scripts on servers that don't have the modules

    - by envinyater
    I need to write a Perl script to gather system information; it will be deployed and executed on different Unix servers. Right now I am writing and testing it, and I'm receiving this error: Can't locate XML/DOM.pm in @INC (@INC contains: /usr/local/lib64/perl5 /usr/local/share/perl5 /usr/lib64/perl5/vendor_perl /usr/share/perl5/vendor_perl /usr/lib64/perl5 /usr/share/perl5 .) at test.pl line 7. BEGIN failed--compilation aborted at test.pl line 7. So I am simply using XML::DOM, which I assumed would be available, but it isn't on this particular server, which runs Perl 5.10.1. Anyway, is there a way I can design my script so that the modules are packaged with it while keeping the .pl extension, which is a requirement for this script?
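
    One low-tech approach, sketched below with assumed paths, is to ship the needed modules in a lib/ directory next to the script and put that directory on @INC with FindBin. Note that XML::DOM pulls in XML::Parser, which contains compiled XS code, so that piece still needs a build matching each server (or a pure-Perl alternative).

        #!/usr/bin/perl
        # Minimal sketch: resolve bundled modules from a lib/ directory that is
        # shipped alongside this .pl file (paths here are illustrative).
        use strict;
        use warnings;

        use FindBin qw($Bin);      # directory containing this script
        use lib "$Bin/lib";        # e.g. lib/XML/DOM.pm copied in at deploy time

        use XML::DOM;              # now found in the bundled lib/ first

        my $parser = XML::DOM::Parser->new;
        my $doc    = $parser->parsefile("$Bin/sysinfo.xml");
        print $doc->getElementsByTagName('host')->getLength, " host entries\n";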

    Read the article

  • Clearing "may not respond" warnings for UIView and UIViewController

    - by user284681
    In an iPad app, I'm using a custom subclass of UIView with UIViewController. Here's the view header: @interface pdfView : UIView { CGPDFDocumentRef doc; } -(void)setDoc:(CGPDFDocumentRef)newDoc; @end And here's the controller header: @interface iPadPDFTestViewController : UIViewController { CGPDFDocumentRef doc; } - (void)loadPDF; @end Part of the controller implementation: - (void)viewDidLoad { [super viewDidLoad]; [self loadPDF]; [self.view setDoc:doc]; } In Interface Builder, I've set the view object to use the class pdfView. At compilation, [self.view setDoc:doc]; gives the warning "'UIView' may not respond to '--setDoc'." I'm guessing that this warning appears because the compiler thinks it's looking at UIView (which does not implement the setDoc method) instead of pdfView. But why does it think that? And how can I tell it what class it's really looking at, so as to clear the warning?
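
    Exactly - self.view is typed as UIView *, so the compiler only checks UIView's interface. A cast (or a small typed accessor) is enough to clear the warning; a minimal sketch, assuming pdfView.h is imported in the controller:

        - (void)viewDidLoad {
            [super viewDidLoad];
            [self loadPDF];
            // Tell the compiler what the nib actually instantiated.
            [(pdfView *)self.view setDoc:doc];
        }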

    Read the article

  • Cannot load JRubyEngine because org.apache.bsf.util.BSFEngineImpl not found

    - by Ceilingfish
    Hi, I'm trying to use JRuby in a custom application, and I don't seem to be able to load the JRubyEngine object. My class is functionally similar to this: public class ScriptEngine { private static ScriptEngine engine = new JRubyEngine(); public void run(final String script, final Map<String,Object> input) { final Bindings context = engine.createBindings(); context.putAll(input); try { engine.eval(script,context); } catch (ScriptException e) { log.error("Failed to execute script: "+getScript(),e); } } } However this fails at compilation with the complaint: [javac] Compiling 486 source files to /workspace/myProject/build/src [javac] /workspace/myProject/src/net/ceilingfish/ScriptEngine.java:31: cannot access org.apache.bsf.util.BSFEngineImpl [javac] class file for org.apache.bsf.util.BSFEngineImpl not found [javac] private static ScriptEngine engine = new JRubyEngine(); [javac] ^ [javac] 1 error Does anyone have any insight into where I can get this class from, or whether there is a better way to instantiate a JRubyEngine object?
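
    The compile error itself only means that bsf.jar (Apache Bean Scripting Framework, which JRubyEngine extends) is missing from the compile classpath. An alternative that avoids BSF altogether is the JSR-223 route; a minimal sketch, assuming the JRuby jar with its javax.script support is on the classpath:

        import java.util.Map;

        import javax.script.Bindings;
        import javax.script.ScriptEngine;
        import javax.script.ScriptEngineManager;
        import javax.script.ScriptException;

        public final class RubyRunner {
            // JRuby registers itself with the standard engine manager as "jruby".
            private final ScriptEngine engine =
                    new ScriptEngineManager().getEngineByName("jruby");

            public Object run(String script, Map<String, Object> input)
                    throws ScriptException {
                Bindings bindings = engine.createBindings();
                bindings.putAll(input);
                return engine.eval(script, bindings);
            }
        }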

    Read the article

  • mingw32-make : "Input line too long" issue

    - by hjsblogger
    We have a Makefile which runs on a Windows 2003 machine, and we are using mingw32-make for it. Since the Makefile has many include paths, it exceeds the 8K command-line buffer that cmd can handle [Ref - http://support.microsoft.com/kb/830473/EN-US/], due to which the compilation fails with an "input line too long" error. I wanted to know the following: What would be the best way to optimize the Makefile, given that I have already removed unwanted compiler switches, include paths, etc.? Is there any way we can put all the INCLUDE paths in one .txt file and import it in the Makefile? I could not find any mingw32 documentation for this. Any other inputs/hints/reference links are most welcome. Thanks, -HJSblogger
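
    One way to keep the command line short, sketched below with assumed file names, is to move the include list into a response file and pass it to gcc/g++ with the @file syntax, which both compilers accept; the long path list then never appears on the command line that cmd has to buffer.

        # includes.rsp holds one -I<path> per line and sits next to the Makefile.
        INC_RSP := includes.rsp

        # (recipe line must start with a tab, as usual)
        %.o: %.c
        	$(CC) @$(INC_RSP) $(CFLAGS) -c -o $@ $<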

    Read the article

  • How to run unit tests with DSSS and GDC?

    - by Benoit Vidis
    I am very new to D and still battling to configure my toolchain. I am running Ubuntu Karmic and would like to use DSSS with GDC and Tango or TangoBos. So far, I have installed GDC from the Ubuntu repositories, and DSSS, Tango and TangoBos from these repositories, and I can compile using dsss + gdc + tangobos. According to the DSSS documentation, it should be possible to run the unit tests using $ dsss build --test but on my system the --test argument is ignored. I have the latest dsss version (0.78) and its inline help does not mention unit tests. Running ldc --unittest works fine (though I do not know exactly which library it picks up). Is there a way to run my unit tests using the same compiler and library as for compilation? If so, is there a way to automate the testing, or will I have to run it module by module?

    Read the article
