Search Results

Search found 6159 results on 247 pages for 'compile'.


  • Using OData to get Mix10 files

    - by Jon Dalberg
    There has been a lot of talk around OData lately (go to odata.org for more information) and I wanted to get all the videos from Mix '10: two great tastes that taste great together. Luckily, Mix has exposed the '10 sessions via OData at http://api.visitmix.com/OData.svc, so now all I have to do is slap together a bit of code to fetch the videos. Step 1 (cut a hole in the box): create a new console application and add a new service reference. Step 2 (put your junk in the box): write a smidgen of code:

        using System;
        using System.IO;
        using System.Linq;
        using System.Net;

        class Program
        {
            static void Main(string[] args)
            {
                var mix = new Mix.EventEntities(new Uri("http://api.visitmix.com/OData.svc"));

                var files = from f in mix.Files
                            where f.TypeName == "WMV"
                            select f;

                var web = new WebClient();

                var myVideos = Path.Combine(
                    Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "Mix10");
                Directory.CreateDirectory(myVideos);

                files.ToList().ForEach(f =>
                {
                    var fileName = new Uri(f.Url).Segments.Last();
                    Console.WriteLine(f.Url);
                    web.DownloadFile(f.Url, Path.Combine(myVideos, fileName));
                });
            }
        }

    Step 3 (have her open the box): compile and run. As you can see, the client reference created for the OData service handles almost everything for me. Yeah, I know there is some batch file to download the files, but it relies on cURL being on the machine – and I wanted an excuse to work with an OData service. Enjoy!

    Read the article

  • Continuous integration never results in build errors

    - by Jon
    Hi, I'm working with a variety of Java EE websites which use internal libraries we've developed. For each website, we only upgrade to new versions of our internal libraries as needed, and before committing we make sure that the site compiles fine. What this means is that when TeamCity does a build of one of our sites, the site compiles fine, but later, when the site is updated to the latest version of the internal libraries, there might be a compile error. Is there a good way to handle this? We're not using Maven yet; would using Maven mean that our websites could automatically use the latest version of internal libraries? Thanks. Clarification, this is what we sometimes run into: Project A depends on a library and is currently using version 1.0 of it. Project B also depends on that library. I make changes to the library so that it is now version 1.5, and project B now uses 1.5. Project A and project B have both been built just fine by the CI server (TeamCity). Working on project A again, I update it to 1.5 and discover that 1.5 has breaking changes in it. Is there a way for the CI server to discover these kinds of breaking changes?
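
    One way to surface such breakage earlier, sketched here rather than taken from the question: with Maven, the internal library can be published as a SNAPSHOT, and a CI build of each site resolves the newest snapshot, so a breaking library change fails the site build the next time TeamCity runs it. The coordinates below are placeholders:

        <!-- hypothetical coordinates; 1.5-SNAPSHOT always resolves to the
             latest 1.5 development build in the internal repository -->
        <dependency>
          <groupId>com.example.internal</groupId>
          <artifactId>shared-lib</artifactId>
          <version>1.5-SNAPSHOT</version>
        </dependency>

    Release builds can still pin an exact version; the floating SNAPSHOT is only for the integration build that is supposed to catch breaking changes.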

    Read the article

  • How to achieve a loosely coupled REST API but with a defined and well understood contract?

    - by BestPractices
    I am new to REST and am struggling to understand how one would properly design a REST system to both allow for loose coupling and at the same time allow a consumer of a REST API to understand the API. If, in my client code, I issue a GET request for a resource and get back XML, how do I know what to do with that XML? e.g. if it contains <fname>John</fname><lname>Smith</lname>, how do I know that these refer to the concepts of "first name" and "last name"? Is it up to the person writing the REST API to define someplace in documentation what each of the XML fields means? What if the producer of the API wants to change the implementation to be <firstname> instead of <fname>? How do they do this and notify their consumers that this change occurred? Or do the consumers just encounter the error and then look at the payload and figure out on their own that it changed? I've read in REST in Practice that using a WADL tool to create a client implementation based on the WADL (and hide the fact that you're doing a distributed call) is an "anti-pattern". But I was planning to do this -- at least then I would have a statically typed API call that, if it changed, I would know about at compile time and not at run time. Why is it a bad thing to generate client code based on a WADL? And how do I know what to do with the links returned in the response of a POST to a REST API? What defines this contract and gives true meaning to what each link will do? Please help! I don't understand how to go from statically-typed or even SOAP/RPC to REST!

    Read the article

  • C++11 Tidbits: access control under SFINAE conditions

    - by Paolo Carlini
    Lately I have been spending quite a bit of time on the SFINAE ("Substitution failure is not an error") features of C++, fixing and tweaking various bits of the GCC implementation. An important missing piece was the implementation of the resolution of DR 1170 which, in a nutshell, mandates that access checking is done as part of the substitution process. Consider:

        class C { typedef int type; };

        template <class T, class = typename T::type>
        auto f(int) -> char;

        template <class>
        auto f(...) -> char (&)[2];

        static_assert (sizeof(f<C>(0)) == 2, "Ouch");

    According to the resolution, the static_assert should not fire, and the snippet should compile successfully. The reason is that the first f overload must be removed from the candidate set because C::type is private to C. On the other hand, before the resolution of DR 1170, the expected behavior was for the first overload to remain in the candidate set, win over the second one, and eventually lead to an access control error (*). GCC mainline (which will become 4.8) finally implements the DR, thus benefiting the many modern programming techniques that heavily exploit SFINAE, among which is certainly the GNU C++ runtime library itself, which relies on it for the internals of <type_traits> and in several other places. Note that the resolution of the DR is active even in C++98 mode, not just in C++11 mode, because it turned out that the traditional behavior, as implemented in GCC, wasn't fully consistent in all the possible circumstances. (*) In practice, GCC didn't really implement this; the static_assert triggered instead.

    Read the article

  • please help me understand libraries/includes

    - by fiftyeight
    I'm trying to understand how libraries work. For example, I downloaded a tarball and extracted it. Now I do "./configure", and it searches in pre-defined directories, from what I understand, for certain library files. What does it do then? It creates a makefile, and the makefile contains the paths to these libraries? Then I do "make", and it compiles the source code and hard-codes the locations of the libraries? Am I correct? I do not really understand whether libraries are files with pre-defined paths or the OS somehow gives access to the libraries through system calls. Another example: I compiled something on my computer, then moved it to a remote server. The executable needs the MySQL libraries to work; the server has MySQL, but for some reason when I execute the file it tells me "can't find libmysqlclient.so.16". Is there a solution for this? Is there a way to know where it tries to locate this file, or to give it another path? I can't compile it on the server since it has no compiler, and I don't have root access to install packages. The last question is: in the sequence "./configure", "make", "make install", is the "make install" command the only one that actually puts files outside the directory in which these files reside? If, for example, the software will be installed in /usr/local/, is the "make install" command the only one that will require "sudo" before it? Let me see if I got it correctly: "./configure" creates the Makefile according to the location of various files on your system, "make" compiles the source code according to this makefile, and "make install" moves the files to their appropriate location. I know this has been very long; I thank anyone who had the patience to read my question :)
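
    On the "can I see where it looks" question: it is the dynamic linker, not the compiled program, that resolves shared libraries at run time, and you can inspect and extend its search path without root access. A sketch (the binary name and paths are placeholders):

        $ ldd ./myprogram                 # list the shared libraries the binary wants
                libmysqlclient.so.16 => not found
        $ mkdir -p ~/lib                  # put a copy of the missing library somewhere writable
        $ export LD_LIBRARY_PATH=~/lib:$LD_LIBRARY_PATH
        $ ./myprogram                     # the linker now also searches ~/lib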

    Read the article

  • Better way to generate enemies of different sub-classes

    - by KDiTraglia
    So let's pretend I have an enemy class that has some generic implementation, and inheriting from it I have all the specific enemies of my game. There are points in my code where I need to check whether an enemy is a specific type, but in Java I have found no easier way than this monstrosity...

        // Must be a better way to do this
        if ( Ninja.class.isAssignableFrom(enemy.getClass()) ) { ... }

    My partner on the project saw these and changed them to use an enum system instead:

        public class Ninja extends Enemy {
            // EnemyType is an enum containing all our enemy types
            public EnemyType enemyType = EnemyTypes.NINJA;
        }

        if (enemy.enemyType == EnemyTypes.NINJA) { ... }

    I also have found no way to generate enemies with varying probabilities besides this:

        for (EnemyType type : enemyTypes) {
            if ( (randomNext = (randomNext - type.getFrequency())) < 0 ) {
                enemy = createEnemy(type);
                break;
            }
        }

        private static Enemy createEnemy(EnemyType type) {
            switch (type) {
                case NINJA:
                    return new Ninja(new Vector2D(rand.nextInt(getScreenWidth()), 0), determineSpeed());
                case GORILLA:
                    return new Gorilla(new Vector2D(rand.nextInt(getScreenWidth()), 0), determineSpeed());
                case TREX:
                    return new TRex(new Vector2D(rand.nextInt(getScreenWidth()), 0), determineSpeed());
                // etc.
            }
            return null;
        }

    I know Java is a little weak at dynamic object creation, but is there a better way to implement this, something like the following?

        for (EnemyType type : enemyTypes) {
            if ( (randomNext = (randomNext - type.getFrequency())) < 0 ) {
                // Change enemyTypes to hold the classes of the enemies I can spawn
                enemy = type.getEnemyClass().newInstance();
                break;
            }
        }

    Is the above possible? How would I declare enemyTypes to hold the classes if so? Everything I have tried so far has generated compile errors and general frustration, but I figured I might ask here before I completely give up to the huge mass that is the createEveryEnemy() method. All the enemies do inherit from the Enemy class (which is what the enemy variable is declared as). Also, is there a shorter way to check which type a particular enemy is than Ninja.class.isAssignableFrom(enemy.getClass())? I'd like to ditch the enums entirely if possible, since they seem repetitive when the class name itself holds that information.
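
    For what it's worth, here is a sketch of the reflective approach the question is reaching for, assuming each enemy exposes a (Vector2D, double) constructor as in the snippets above (the double for determineSpeed() is a guess, and the frequencies are made up; adjust to the real signatures). The enum member carries the concrete Class object, so one reflective call replaces the whole createEnemy switch:

        import java.lang.reflect.Constructor;

        public enum EnemyType {
            NINJA(Ninja.class, 10),
            GORILLA(Gorilla.class, 5),
            TREX(TRex.class, 1);

            private final Class<? extends Enemy> clazz;
            private final int frequency;

            EnemyType(Class<? extends Enemy> clazz, int frequency) {
                this.clazz = clazz;
                this.frequency = frequency;
            }

            public int getFrequency() { return frequency; }

            // One reflective lookup-and-invoke instead of a switch case per type.
            public Enemy create(Vector2D position, double speed)
                    throws ReflectiveOperationException {
                Constructor<? extends Enemy> ctor =
                        clazz.getConstructor(Vector2D.class, double.class);
                return ctor.newInstance(position, speed);
            }
        }

    With that, the spawn loop becomes enemy = type.create(new Vector2D(rand.nextInt(getScreenWidth()), 0), determineSpeed()). As for the type check itself, the short idiomatic form is simply: if (enemy instanceof Ninja) { ... }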

    Read the article

  • Determinism in multiplayer simulation with Box2D, and single computer

    - by Jake
    I wrote a small test car-driving multiplayer game with Box2D, using TCP server-client communications. I ran 1 instance of server.exe and 2 instances of client.exe on the same machine that I use to code and compile the executables. I type inputs (WASD for simple car movement) into one of the 2 clients, and I can get both clients to update the simulation. There are 2 cars in the simulation. As long as the cars do not collide, I get identical output on both client.exe instances; I can run the car(s) around for as long as I like and they still update the same. However, if I start to collide the cars, they very quickly go out of sync. My tools: Windows 7, C++, MSVS 2010, Box2D, freeGLUT. My pseudocode:

        // client.exe
        void timer(int value) {
            tcpServer.send(my_inputs);
            foreach(i = player including myself)
                inputs[i] = tcpServer.receive();
            foreach(i = player including myself)
                players[i].process(inputs[i]);
            myb2World.step(33, 8, 6); // Box2D world step simulation
            foreach(i = player including myself)
                renderer.render(player[i]);
            glutTimerFunc(33, timer, 0);
        }

        // server.exe
        void serviceloop() {
            while(all clients alive) {
                foreach(c = clients)
                    tcpClients[c].receive(&inputs[c]);
                // send input of each client to all clients
                foreach(source = clients)
                    foreach(dest = clients)
                        tcpClients[dest].send(inputs[source]);
            }
        }

    I have read all over the internet and SE the following claims (paraphrased): (1) Box2D is deterministic as long as the floating point architecture/implementation is the same. (2) For any deterministic engine, determinism is guaranteed if playback of recorded inputs is on the same machine, with an exe compiled using the same compiler and machine. Additionally, my server.exe and client.exe game loops are single-threaded, with blocking socket calls and a fixed time step. Question: can anyone explain what I did wrong to get different Box2D output?
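
    A common culprit, offered as a sketch rather than a certain diagnosis: in the loop above, each client applies whatever inputs happen to have arrived by its own timer tick, so the same keypress can be applied on step N on one client and step N+1 on the other; identical floating point then no longer saves you once a collision amplifies the difference. A lockstep variant tags every input with the step it belongs to and refuses to advance until all players' inputs for that step are present (all names here are illustrative):

        #include <cstddef>
        #include <map>
        #include <vector>

        struct Input { int player; int step; unsigned keys; };

        std::map<int, std::vector<Input>> pendingInputs; // step -> inputs received so far
        int currentStep = 0;
        const std::size_t playerCount = 2;

        void onInputReceived(const Input& in) {
            pendingInputs[in.step].push_back(in);
        }

        void tryStep(/* b2World& world */) {
            auto it = pendingInputs.find(currentStep);
            if (it == pendingInputs.end() || it->second.size() < playerCount)
                return; // wait: stepping on partial input is what desyncs the worlds
            for (const Input& in : it->second) {
                // apply this player's input to their car, exactly once, on this step
                (void)in;
            }
            // world.Step(1.0f / 30.0f, 8, 6); // identical dt and iteration counts everywhere
            pendingInputs.erase(it);
            ++currentStep;
        }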

    Read the article

  • Trying to make ZeroRadiant work on Lubuntu 11.10

    - by maniat1k
    I'm looking to use ZeroRadiant on Ubuntu; it's for creating maps for Urban Terror, and I know it works on Windows and Mac. I found this HOWTO but could not make it work. I get stuck at this step:

        ~/ZeroRadiant-src$ scons target=radiant,q3map2 config=release

    At the end it shows me this:

        collect2: ld returned 1 exit status
        scons: *** [build/release/radiant/radiant.bin] Error 1
        scons: building terminated because of errors.

    There's a note in the HOWTO that says: "If you get other errors, you may try asking for help in the GtkRadiant IRC channel, which is listed on the main ZeroRadiant page." The page is not there, and I couldn't find that IRC channel. EDIT: thanks to @jokerdino, who in the chat showed me this: "If you have the proprietary NVIDIA driver installed and you get the following error when executing the build target above:

        /usr/bin/ld: cannot find -lGL
        collect2: ld returned 1 exit status
        scons: *** [build/release/radiant/radiant.bin] Error 1
        scons: building terminated because of errors.

    then you should install one of the nvidia-glx-dev* packages as mentioned above in Step A, then try to execute the main build target to compile GtkRadiant again." I don't have NVIDIA; I have an ATI AMD Radeon HD 6320, and it looks like it works:

        GL_VERSION: 4.1.11251 Compatibility Profile Context
        GL_VENDOR: ATI Technologies Inc.
        GL_RENDERER: AMD Radeon HD 6320 Graphics

    I think I do have video issues, but how do I detect this? How can I continue?

    Read the article

  • C printf not working on ubuntu 13.10 terminal

    - by Blaze Tama
    First, I'm new to Ubuntu, so please bear with me. I need to create a C-based program for a course at my university. I was using openSUSE and its Konsole when I was in the university's lab, so basically I need to install openSUSE on my system or use VMware. But I feel lazy to do that, so I tried to run it on my Ubuntu instead of openSUSE. However, no C code seems to be working in Ubuntu's terminal. The compilation succeeds, but it's not running, or at least the printf is not running. This is my code, a very, very simple one:

        #include <stdio.h>
        #include <unistd.h>
        #include <stdlib.h>

        int main()
        {
            printf("test");
            return 0;
        }

    When I compile it with gcc test.c -o test, everything works fine and I get an executable file. Then I try to run it with ./test, but the printf is not printed. No error or warning was shown. Am I missing something? Note: my gcc is the new one; it should have no problems. Thanks for your help :D
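
    For what it's worth, the code itself is fine; here is a sketch of the two usual fixes. Without a trailing newline, "test" is printed but runs straight into the next shell prompt, where it is easy to overlook; a newline (or an explicit flush) makes it obvious. Note also that a binary named test is shadowed by the shell builtin if it is ever run as plain test instead of ./test.

        #include <stdio.h>

        int main(void)
        {
            printf("test\n");  /* trailing newline: the output gets its own line */
            fflush(stdout);    /* force the buffer out, in case stdout is not line-buffered */
            return 0;
        }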

    Read the article

  • Workaround for an Xcode/iOS SDK Issue...

    - by Joe Huang
    Hi, everyone: When you are doing ADF Mobile development and you need to deploy the application to an iOS device, you need to compile/deploy the app with iOS app certificates and a provisioning profile. This means you need to "Deploy to Package" or "Deploy to iTunes" during deployment, and configure JDeveloper with the proper certificates/profiles. In some instances (the exact combination is still not clear), deploying and signing the application to generate the ipa file may fail, with an error message similar to this at the end of the deployment log:

        [01:04:45 PM] Deployment failed due to one or more errors returned by '/usr/bin/xcrun'. The following is a summary of the returned error(s):
        Command-line execution failed (Return code: 1)
        error: /usr/bin/codesign --force --preserve-metadata=identifier,entitlements,resource-rules --sign iPhone Distribution: Oracle Corporation --resource-rules=/var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app/ResourceRules.plist --entitlements /var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/entitlements_plistEINPBkIG /var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app failed with error 1. Output: /var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app: replacing existing signature
        Program /usr/bin/codesign returned 1 : [/var/folders/x7/21sjrpx13qj9tq20z14s3j_w0000gn/T/tkROhP11qU/Payload/HelloWorld.app: replacing existing signature

    This is a known issue and is not related to ADF Mobile. The workaround is discussed in this article from Stack Overflow. The article refers to the old location of Xcode, so you need to adjust the paths accordingly; for Xcode 4.3 and above, the script file in question would be at:

        /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/PackageApplication

    To modify it, you probably can't use Text Editor. I ended up opening a terminal session, changing the file permission, and using vi to update it. Thanks, Oracle ADF Mobile Product Management Team

    Read the article

  • Java Applet or Unity3D for Cross-Platform 3D Surveying App

    - by Jake M
    Do you think a Java applet or a Unity3D application is the better option to make a cross-browser 3D web app? I intend to make a web application that displays 3D environments that can be navigated by dragging (with a finger or a mouse, depending on the platform). The web app will render 3D environments of development sites, including contours, water pipeline locations, buildings, etc. The application must work on Windows desktop, Android, iOS, and Windows Phone, so this is why I am tending towards a web app as opposed to a cross-platform smartphone library (like MoSync or Marmalade). The 3D environments will be navigable (by dragging around) and contain simple (not detailed) 3D objects like buildings, mountains, pipelines, etc. One thing I know is that WebGL is out, because it doesn't work on IE and has limited support on smartphones (am I correct to completely disregard WebGL?). Will future smartphone browsers continue to support Java applets? Also, is it really true that I can write ONE application/game in Unity3D and simply compile it to run on Windows, Mac, Xbox 360, PlayStation 3, Wii, iPad, iPhone and Android? Would you suggest the Unity3D application path or the Unity3D Web Player path? Concerning Unity3D, there's one thing I am unsure about: do all Unity3D features work on iOS and Android?

    Read the article

  • Why does DataContractJsonSerializer not include generic like JavaScriptSerializer?

    - by Patrick Magee
    So the JavaScriptSerializer was deprecated in favor of the DataContractJsonSerializer.

        var client = new WebClient();
        var json = await client.DownloadStringTaskAsync(url); // http://example.com/api/people/1

        // Deprecated, but clean looking and generally fits in nicely with
        // other code in my app domain that makes use of generics
        var serializer = new JavaScriptSerializer();
        Person p = serializer.Deserialize<Person>(json);

        // Now I have to make use of an ugly typeof to get the Type when I
        // already know the type at compile time. Why no generic type T?
        var serializer = new DataContractJsonSerializer(typeof(Person));
        Person p = serializer.ReadObject(json) as Person;

    The JavaScriptSerializer is nice and allows you to deserialize using a generic type T in the function name. Understandably, it's been deprecated for good reason: with the DataContractJsonSerializer, you can decorate the type to be deserialized with various attributes so it isn't as brittle as the JavaScriptSerializer, for example:

        [DataMember(Name = "personName")]
        public string Name { get; set; }

    Is there a particular reason why they decided to only allow users to pass in the Type?

        Type type = typeof(Person);
        var serializer = new DataContractJsonSerializer(type);
        Person p = serializer.ReadObject(json) as Person;

    Why not this?

        var serializer = new DataContractJsonSerializer();
        Person p = serializer.ReadObject<Person>(json);

    They can still use reflection with the DataContract-decorated attributes based on the T that I've specified on the .ReadObject<T>(json).
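
    Nothing stops you from writing the generic shape yourself as a small helper; here is a sketch (note that DataContractJsonSerializer.ReadObject actually takes a Stream rather than a string, so the JSON text is wrapped in a MemoryStream here):

        using System.IO;
        using System.Runtime.Serialization.Json;
        using System.Text;

        public static class JsonHelper
        {
            public static T Deserialize<T>(string json)
            {
                // typeof(T) still appears, but only once, hidden behind the generic.
                var serializer = new DataContractJsonSerializer(typeof(T));
                using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
                {
                    return (T)serializer.ReadObject(stream);
                }
            }
        }

        // usage: Person p = JsonHelper.Deserialize<Person>(json);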

    Read the article

  • How to present a stable data model in a public API that allows internal data structures to be changed without breaking the public view of the data?

    - by Max Palmer
    I am in the process of developing an application that allows users to write C# scripts. These scripts allow users to call selected methods and to access and manipulate data in a document. This works well; however, in the development version, scripts access the document's (internal) data structures directly. This means that if we were to change the internal data model/structure, there is a good chance that someone's script will no longer compile. We obviously want to prevent this breaking change from happening, but still want to allow the user to write sensible C# code (whilst not restricting how we develop our internal data model as a result). We therefore need to decouple our scripting API and its data structures from our internal methods and data structures. We have a few ideas as to how we might allow the user to access what is effectively a stable public version of the document's internal data*, but I wanted to throw the question out there to someone who might have some real experience of this problem. NB: our internal document's data structure is quite complex and could be quite difficult to wrap. We know we want to expose as little as possible in our public API, especially as once it's out there, it's out there for good. Can anyone help? How do scripting languages/APIs decouple their public API and data structures from their internal data structures? Is there no real alternative to writing a complex interaction layer? If we need to do this, what's a good approach or pattern for wrapping complex data structures that include nested objects, including collections? I've looked at the API facade pattern, which looks like it's trying to address these kinds of issues, but are there alternatives? *One idea is to build a data facade that is kept stable across versions of our application. The facade exposes a set of facade data objects that are used in the script code. These maintain backwards compatibility and wrap access to our internal document's data model.
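
    To make the footnoted idea concrete, here is a minimal sketch of such a data facade; every type name in it (InternalDocument, InternalNode, GetNodes, DisplayName) is invented for illustration. The script-facing classes hold a reference to the internal object and translate between the stable public shape and whatever the internal model currently looks like:

        using System.Collections.Generic;
        using System.Linq;

        // Stable, script-facing surface: when the internal model is
        // restructured, only the mapping code in these classes changes.
        public class DocumentFacade
        {
            private readonly InternalDocument doc; // internal type, free to evolve

            public DocumentFacade(InternalDocument doc) { this.doc = doc; }

            public IReadOnlyList<NodeFacade> Nodes
            {
                get { return doc.GetNodes().Select(n => new NodeFacade(n)).ToList(); }
            }
        }

        public class NodeFacade
        {
            private readonly InternalNode node;

            public NodeFacade(InternalNode node) { this.node = node; }

            // Wraps rather than exposes: renaming DisplayName internally
            // does not break scripts that read Name.
            public string Name
            {
                get { return node.DisplayName; }
            }
        }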

    Read the article

  • SubMain Ghost Doc Pro with SpellChecking

    - by TATWORTH
    SubMain have announced at http://community.submain.com/forums/2/1556/ShowThread.aspx#1556 that the next version of GhostDoc will include a VS2005/VS2008/VS2010-compatible spell checker. This replaces their existing spell checker (http://submain.com/products/codespell.aspx), which is being discontinued. If you buy GhostDoc Pro now (and I urge you to, as it helps tremendously in documenting both C# and VB.NET code), be sure to include Licence Protection, as it means you will get the next version, which includes the spell checker, free! Why is a spell checker important? By spell checking all your comments, you will make your documentation much easier to read. This means that instead of being distracted by typographic errors, your mind will be free to see errors in what has been written. Remember, the next person that has to struggle to read your code could well be yourself! So be kind to yourself. Do the following:
    1. Document whole source files in VB.NET or C# with GhostDoc Pro.
    2. Run StyleCop and fix the issues it uncovers.
    3. Run the spell checker (when it is available).
    4. Add remarks where necessary.
    5. Specify in the project to produce XML documentation.
    6. Compile the XML into help files using Sandcastle.
    7. Review the help files and ask yourself if the explanations are sufficient.
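
    For context, the XML documentation produced in steps 5 and 6 comes from ordinary triple-slash comments like these (the method is made up for illustration); GhostDoc drafts the text, the spell checker vets it, and Sandcastle compiles the project's generated XML file into help topics:

        /// <summary>
        /// Calculates the gross amount for an invoice, including tax.
        /// </summary>
        /// <param name="net">The net amount.</param>
        /// <returns>The gross amount, including tax.</returns>
        public decimal CalculateTotal(decimal net)
        {
            return net * 1.2m; // illustrative fixed tax rate
        }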

    Read the article

  • What partition to use to keep data files in Ubuntu?

    - by Martin Lee
    I have been using Ubuntu for a few years, and usually my partition setup was the following: an Ext3 or Ext4 partition for the system itself (20 GB); a 10 GB swap partition; and a big FAT32 partition to store movies, photos, work stuff, etc. (this depends on the capacity of the disk, but usually it is what is left after Ext3 + swap; currently it is more than 200 GB). Does this setup sound right? I am considering switching to one big Ext3 partition now, because the problems with FAT32 in Ubuntu have not gone anywhere: for example, right now I can access my 'big' partition with a 'Data' label only through /media/_themes?END. Pretty strange name for a partition, isn't it? Some Linux software fails to read/write on this partition. For example, if I want to play around with rebar and build/make/compile things on this FAT32 partition, it will always complain about permissions and won't work (the same goes for many other kinds of software). It is not stable: I cannot refer to some files on this FAT32 partition, because after the next reboot it will be called not '_themes?END' but something else. On the other side, I usually begin to run out of space on the Ext3 partition after a few months of usage. So, the question is: what is the best setup of partitions for an Ubuntu system? Should a FAT32 partition be used at all?
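
    On the permissions complaint: FAT32 cannot store Unix owners or modes at all, which is why build tools like rebar fail on it; ownership is assigned at mount time instead. A sketch of an /etc/fstab entry (the UUID is a placeholder; uid/gid 1000 is the first user on a default Ubuntu install), which also gives the partition a fixed mount point instead of a name that changes between reboots:

        # FAT32 data partition with a fixed mount point and ownership
        UUID=XXXX-XXXX  /media/data  vfat  uid=1000,gid=1000,umask=022  0  0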

    Read the article

  • Boot failure randomly. Can someone help?

    - by desgua
    I often get stuck at boot when on battery. I can suspend and hibernate without any trouble (even on battery). If I plug in the power cable, everything goes right most of the time (edited June 13). Adding "acpi=off" doesn't solve the issue. Disabling power save for the LAN in the BIOS doesn't solve it either. The memory test seems to be OK, and I can even compile a kernel without a problem. Does someone have a tip for this? (I have at least as many question marks in my mind as the boot screen shows :-/ ) obs.: this is an upgraded installation from Natty alpha. Edited I (May 22): I've installed a brand new final 11.04 and got the bug again. Edited II (June 13): I thought it was solved, but it is not. After spending eight days off, the same problem happened even with the power cable connected. I had to reboot about 6 times until success. Edited III (June 14): I can boot (even on battery) if I disconnect the cable and take the battery out for a few seconds. This led me to conclude that maybe something is kept in the memory of some piece of hardware. Maybe related, maybe not: I also have touchpad issues (jumps) with this machine. Edited IV (June 25): I opened gconf-editor, went to apps, gnome-power-manager, and disabled everything possibly related to suspend or hibernate. No help. Also, I grabbed a Fedora live USB pendrive and got the same problem.

    Read the article

  • Why doesn't my IDE do background compiling/building?

    - by MKO
    Today I develop on a fairly complex computer: it has multiple cores, SSD drives, and what not. Still, most of the time I'm programming, the computer is leisurely doing nothing. When I need to compile and run/deploy a somewhat complex project, at best it still takes a couple of seconds. Why? Now that we're living more and more in the "age of instant", why can't I press F5 in Visual Studio and launch/deploy my application instantly? A couple of seconds might not sound so bad, but it's still cognitive friction and time that adds up, and frankly it makes programming less fun. So how could compilation be instant? Well, people tend to edit files in different assemblies; what if Visual Studio/the IDE constantly did compilation and building of everything that I modified, anytime that it might be appropriate? Heck, if they wanted to go really advanced, they could do per-class compilation. The compilation might not work, but then it could just silently do nothing (except adding error messages to the error window). Surely today's computers could dedicate a core or two to this task, and if someone found it annoying, it could be disabled by an option. I know there are probably a thousand technical issues and some fancy shadow copying that would need to be resolved for this to be seamless and practical, but it sure would make programming more seamless. Is there any practical reason why this scenario isn't possible? Would the wear and tear of continually writing binaries be too much? Couldn't assemblies be held in memory until deployed/run?

    Read the article

  • How to do Cross Platform in own Engine? [on hold]

    - by Mineorbit
    At the moment I have finished the first game made with my game engine (if I want to call it that), which is based on LWJGL. Now I'm wondering whether I could make my engine cross-platform. I built myself a tool with a batch file to compile my project directory into an .exe. First, I'm looking to do the same for Android with a comparable batch file; a link to a tutorial would be awesome! Next there would be the renderer and the audio system. I've read that there's an OpenGL ES renderer, and I already played around a bit with the Android SDK. But I use the Texture and Audio classes in slick-util, so I thought about creating OOP classes that carry the data around and load it into a platform-specific buffer. A link to an equally easy-to-use Texture or Audio class would be awesome! That's all for now! Answers would be awesome! Thanks, Mineorbit!

    Read the article

  • Troubleshooting VMware on Ubuntu

    Summary of different problems while using VMware products on Ubuntu. This article is going to be updated from time to time with new information about running VMware products more or less smoothly on Ubuntu. The following are links to existing articles: Running VMware Player on Linux (Xubuntu Hardy Heron); Running VMware Server on Linux (version 1.0.6 on Xubuntu); Using ext4 in a VMware machine. VMware mouse grab/ungrab problem (source: LinuxInsight): the upgraded GTK library in Ubuntu since Karmic Koala gives you strange mouse behaviour. Even if you have the "Grab when cursor enters window" option set, VMware won't grab your pointer when you move the mouse into the VMware window. Also, if you use Ctrl-G to capture the pointer, the VMware window will release it as soon as you move the mouse around a little bit. Quite annoying behavior... Fortunately, there's a simple workaround that can fix things until VMware resolves the incompatibilities with the new GTK library. VMware Workstation ships with many standard libraries, including libgtk, so the only thing you need to do is force it to use its own versions. The simplest way to do that is to add the following line to the end of the /etc/vmware/bootstrap configuration file and restart the Workstation:

        export VMWARE_USE_SHIPPED_GTK="force"

    The interface will look slightly odd, because an older version of GTK is being used, but at least it will work properly. Note: after upgrading to a new Linux kernel, it is necessary to recompile the VMware modules; this requires temporarily commenting out the export line in /etc/vmware/bootstrap.

    Read the article

  • How to build the mainline kernel source package?

    - by Maxime R.
    The Ubuntu kernel PPA only provides linux-headers*.deb and linux-image*.deb packages. How can I build the corresponding linux-source*.deb package? Context: I'm currently running Ubuntu 11.10 with the mainline kernel (3.2-rc6 now) to get better support for my Sandy Bridge IGP (Dell E6420 laptop with an Intel i5-2520M CPU). As it happens, I'd like to install this touchpad driver, ALPS touchpads being badly supported (see the previously linked bug report), while waiting for upstream support in kernel version 3.3. The problem is, DKMS keeps complaining about not finding the full kernel source: "Module build for the currently running kernel was skipped since the kernel source for this kernel does not seem to be installed." It appears I may not need the full source, but I'd still like to try having it installed to see if it solves my problem. What I tried:
    1. Uncompressing the kernel.org source archive in /usr/src/. DKMS still complained.
    2. Manually updating the kernel source package with uupdate and the mainline source package, as explained here. Did not succeed.
    3. Manually building the linux-source package following @roadmr's and @elmicha's instructions. I eventually succeeded in building it, but DKMS still complained about the missing source. At last I noticed an error I hadn't caught in the first place while reinstalling the kernel headers. It appears the .deb I got may have been corrupted; downloading it again did the trick :) Alas, while DKMS then agreed to compile the module, I ran into the following error, which appears to have already been reported. This issue isn't solved yet, but I won't pursue it, because in the end I decided to test the precise kernel version 3.2-rc6 through the xorg-edgers PPA, which appears to be correctly patched: it works. Nevertheless, it might still be of some interest to know how to build the mainline linux-source package, as the Ubuntu Kernel Team doesn't provide it. Not to mention that I learned a lot in the process ^^

    Read the article

  • Why have minimal user/handwritten code and do everything in XAML?

    - by Mrk Mnl
    I feel the MVVM community has become overzealous, like the OO programmers in the 90's - it is a misnomer that MVVM is synonymous with no code. From my closed Stack Overflow question: many times I come across posts here about someone trying to do the equivalent in XAML instead of code-behind, their only reason being that they want to keep their code-behind 'clean'. Correct me if I am wrong, but is it not the case that:
    - XAML is compiled too - into BAML - and then at runtime has to be parsed into code anyway.
    - XAML can potentially have more runtime bugs, as they will not be picked up by the compiler at compile time - from incorrect spellings, for instance - and these bugs are also harder to debug.
    - There already is code-behind - like it or not, InitializeComponent(); has to be run, and the .g.i.cs file it is in contains a bunch of code, though it may be hidden.
    Is it purely psychological? I suspect it is developers who come from a web background and like markup as opposed to code. EDIT: I don't propose code-behind instead of XAML - use both - I prefer to do my binding in XAML too - I am just against making every effort to avoid writing code-behind, especially in a WPF app - it should be a fusion of both to get the most out of it. UPDATE: It's not even Microsoft's idea; every example on MSDN shows how you can do it in both.

    Read the article

  • Keeping files that are often changed in sync between desktop and laptop

    - by N.N.
    I'm looking for a way to keep a desktop and a laptop in sync. What I want to keep in sync are some folders, mainly ~/Documents, that are changed often when working on them. If it matters, I can connect to my desktop from anywhere via a URL, but my laptop is harder to access since it might be behind NAT and such. I have been looking at Ubuntu One, but it seems not to go well with working on documents written in LaTeX. If I work on a .tex file in the Ubuntu One directory and compile it (with pdflatex) every now and then (as often as every 10 seconds when working), it will create several new files, including a PDF, which are uploaded to Ubuntu One, and this seems stupid since it will create continuous uploads when working on .tex files. I also usually keep .tex documents version-controlled by git, and then every commit (which can also happen frequently) will cause an upload (by changes in ./.git), so that it happens continuously when working. Another example is editing images that are saved often. What I think would be best is for the sync to happen every tenth minute, or at the end of every working session (but there might be some other way to handle this?).
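
    If Ubuntu One keeps fighting the LaTeX build churn, one workable sketch is a one-way rsync from the laptop to the always-reachable desktop on a timer, excluding the files pdflatex regenerates anyway (the hostname, user, and exclude list are placeholders to adapt):

        # crontab entry: push ~/Documents to the desktop every 10 minutes
        */10 * * * * rsync -az --delete --exclude '*.aux' --exclude '*.log' --exclude '*.out' ~/Documents/ user@desktop.example.com:Documents/

    Since the documents are already under git, a cron job that simply pushes to a bare repository on the desktop would work in the same spirit, with the .git traffic batched instead of synced file by file.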

    Read the article

  • CodePlex Daily Summary for Wednesday, June 26, 2013

    Popular Releases:
    - Naked Objects: Naked Objects Release 5.5.0: This release includes a number of significant improvements to the usability of the UI, some of which involve new programming conventions or attributes. Action dialogs now appear as pop-up modal dialogs instead of as a new page; query-only actions have an Apply as well as an OK button (see https://nakedobjects.codeplex.com/workitem/175). When a reference object is expanded in-line, there is a button to jump straight to an Edit view of that object (see https://nakedobjects.codeplex.com/workitem/1...)
    - VeraCrypt: VeraCrypt version 1.0b: Changes since version 1.0a: enhance the RIPEMD160 implementation in the boot loader by using the compiler uint32 type; don't position the legacy flag in the volume header for newer VeraCrypt releases.
    - Player Framework by Microsoft: Player Framework for Windows 8 and WP8 (v1.3 beta): Preview: new MPEG DASH adaptive streaming plugin for WAMS. Preview: new UltraViolet CFF plugin. Preview: new WP7 version with WP8 compatibility (source code only). Source code is now available via CodePlex Git. Misc bug fixes and improvements: WP8 only: added optional fullscreen and mute buttons to the default XAML. JS only: protecting currentTime from returning infinity; some videos would cause currentTime to be infinity, which could cause errors in plugins expecting only finite values. (...
    - SSIS DQS Matching Transformation: SSIS DQS Matching Transformation 1.0: Initial release of the SSIS DQS Matching Component.
    - AssaultCube Reloaded: 2.5.8: SERVER OWNERS: note that the default maprot has changed once again. Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we continue to try to package for those OSes; or better yet, try to compile it, and if it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compi...
    - Compare .NET Objects: Version 1.7.2.0: If you like it, please rate it. :) Performance improvements; fix for deleted row in a data table; added ability to ignore the collection order; fix for ignoring by attributes.
    - Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.95: Update parser to allow the CSS3 calc( function to nest. Add recognition of the -pponly (Preprocess-Only) switch in the AjaxMinManifestTask build task. Fix crashing bug in the EXE when processing a manifest file using the -xml switch and an error message needs to be displayed (like a missing input file). Create separate Clean and Bundle build tasks for working with manifest files (AjaxMinManifestCleanTask and AjaxMinBundleTask). Removed the IsCleanOperation from AjaxMinManifestTask -- use AjaxMinMan...
    - VG-Ripper & PG-Ripper: VG-Ripper 2.9.44: Changes: NEW: added support for "ImgChili.net" links; FIXED: Auto Updater.
    - Document.Editor: 2013.25: What's new for Document.Editor 2013.25: improved spell check support; improved user interface; minor bug fixes, improvements and speed-ups.
    - StyleMVVM: 3.0.2: This is a minor feature and bug fix release. Features: ExportWhenDebuggerIsAttacedAttribute - new attribute that marks an export to only be exported when the debugger is attached; InjectedFilterAttributeFilterProvider - new attribute filter provider for MVC that injects the attributes; performance improvements - minor speed improvements all over, and Import collections are now 50% faster. Bug fixes: open generic constraints are now respected when finding exports; fix for fluent registrat...
    - WPF Composites: Version 4.3.0: In this beta release, I broke my code out into two separate projects. There is a core FasterWPF.dll with the minimal required functionality. This can run with only the Aero.dll and the Rx .dll's. Then, I have a FasterWPFExtras .dll that requires and supports the Extended WPF Toolkit™ Community Edition V 1.9.0 (including Xceed DataGrid) and the Thriple .dll. This is for developers who want more . . . Finally, you may notice the other OPTIONAL .dll's available in the download such as the Dyn...
    - Channel9's Absolute Beginner Series: Windows Phone 8: Entire source code for the Channel 9 series, Windows Phone 8 Development for Absolute Beginners.
    - Indent Guides for Visual Studio: Indent Guides v13: Important: this release does not support Visual Studio 2010; the latest stable release for VS 2010 is v12.1. Changed in v13: added page width guide lines; added guide highlighting options; fixed guides appearing over collapsed blocks; fixed guides not appearing in newly opened files; fixed some potential crashes; fixed lines going through pragma statements; various updates for VS 2012 and VS 2013; removed VS 2010 support. Changed in v12.1: fixed crash when unable to start...
    - Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 2.1.0 - Prerelease d (supports .NET 3.5, 4.0 and 4.5). Includes: Fluent.dll (with .pdb and .xml); Showcase Application; Samples (not for .NET 3.5); Foundation (Tabs, Groups, Contextual Tabs, Quick Access Toolbar, Backstage); Resizing (ribbon reducing & enlarging principles); Galleries (Gallery in ContextMenu, InRibbonGallery); MVVM (shows how to use this library with the Model-View-ViewModel pattern); KeyTips; ScreenTips; Toolbars; ColorGallery. *Walkthrough (do...
    - Magick.NET: Magick.NET 6.8.5.1001: Magick.NET compiled against ImageMagick 6.8.5.10. Breaking changes: MagickNET.Initialize has been made obsolete because the ImageMagick files in the directory are no longer necessary; MagickGeometry is no longer IDisposable; renamed the dll's so they include the platform name; image profiles can now only be accessed and modified with ImageProfile classes; renamed DrawableBase to Drawable; removed the Args part of the PathArc/PathCurvetoArgs/PathQuadraticCurvetoArgs classes. The...
    - Keyboard Image Viewer: 1.5.4: Upgraded the folder picker dialog to a better version on Win7+. Fixed a bug that stopped the slideshow from looping back to the start of the list. Added a crash dialog that allows you to see and copy exception stack traces for fixing.
    - DependencyAnalysis (Egg and Gherkin): 0.9.4: Create a Visual Studio 2012 add-in for ad-hoc analysis of your project; display metrics in a grid; adequately performing serialization between the add-in (Visual Studio process) and the AnalysisHost process; display dependencies as a graph (proximity graph); create a logo for the project; constructors of anonymous types no longer hide constructors of the declaring type during the "build dependencies" phase; type descriptors were added multiple times to the SubmoduleDescriptor.Types collection, same instanc...
    - Bloomberg API Emulator: Bloomberg API Emulator (v 1.0.5): This version contains the full Java port of my original C# code. I just finished the MarketDataSubscription request type. I will start working on a C++ port of my C# code.
    - Three-Dimensional Maneuver Gear for Minecraft: TDMG 1.1.0.0 for 1.5.2: (release notes originally in Japanese; the text did not survive in this copy)
    - Hyper-V Management Pack Extensions 2012: HyperVMPE2012 (v1.0.1.126): Hyper-V Management Pack Extensions 2012 beta release.

    New Projects:
    - .Net Encryption App: A C#.NET desktop application that will let users encrypt and decrypt files with the algorithm of their choosing.
    - AutoSPDocumenter: Utilises PowerShell to document SharePoint farms and provide output in usable formats.
    - Azure Storage Redirector: Redirects requests for global Azure Storage to China Azure Storage.
    - brownbag: Simple project to show branching, merging, and shelving.
    - Channel9's Absolute Beginner Series: Channel 9's absolute beginner series source code, from Windows Phone 7 and Windows Phone 8 to Windows Store applications: a one-stop area for all the series.
    - FAST for Sharepoint 2010 Query Tool (.NET 3.5): .NET 3.5 version of the FAST Search for SharePoint MOSS 2010 Query Tool (https://fastforsharepoint.codeplex.com/), for environments without .NET 4.0.
    - HAOest Framework: (description originally in Chinese; the text did not survive in this copy)
    - ListManager: by HADB of HAOest.
    - MyPS: myps
    - NNRel: NNRel
    - O - BV - 2: Test
    - Open XML SDK for JavaScript: Small JavaScript library that enables you to implement Open XML functionality anywhere you can use JavaScript.
    - Orchard Prefix free: Provides a script manifest for the Prefix free script libraries.
    - PVDesktop: An application for designing and analyzing specific solar energy sites.
    - Red the sound TowePlay: School project at ISEN Lille: creation of collaborative music creation software in C# using .NET.
    - Science Kits for Kids: A service for childhood education, for children aged 4-8.
    - SharePoint ULS for PowerShell: Allows PowerShell to log to the SharePoint ULS.
    - SSIS DQS Matching Transformation: Uses Data Quality Services (DQS) to find duplicate data within the SSIS data flow.
    - TARVOS Computer Networks Simulator: Discrete event-based network simulator; supports simulating the MPLS architecture, several RSVP-TE protocol functionalities, and fast recovery.
    - Web API Explorer 4 DNN: Web API Explorer for DNN® aids module development, allowing you to examine the Routing Table entries for a DotNetNuke® portal.

    Read the article

  • Acceptable placement of the composition root using dependency injection and inversion of control containers

    - by Lumirris
    I've read in several sources, including Mark Seemann's 'Ploeh' blog, that the appropriate placement of the composition root of an IoC container is as close as possible to the entry point of an application. In the .NET world, these applications seem to be commonly thought of as web projects, WPF projects, console applications - things with a typical UI (read: not library projects). Is it really going against this sage advice to place the composition root at the entry point of a library project, when that project represents the logical entry point of a group of library projects, and the client of a project group such as this is someone else's work, whose author can't or won't add the composition root to their project (a UI project, or even yet another library project)? I'm familiar with Ninject as an IoC container implementation, but I imagine many others work the same way, in that they can scan for a module containing all the necessary binding configurations. This means I could put a binding module in its own library project to compile with my main library project's output, and if the client wanted to change the configuration (an unlikely scenario in my case), they could drop in a replacement dll for the library with the binding module. This seems to avoid the most common clients having to deal with dependency injection and composition roots at all, and would make for the cleanest API for the library project group. Yet this seems to fly in the face of conventional wisdom on the issue. Is it just that most of the advice out there assumes the developer has some coordination with the development of the UI project(s) as well, rather than my case, in which I'm just developing libraries for others to use?
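
    For what the "binding module in its own library project" idea might look like with Ninject, here is a minimal sketch (IParser, XmlParser, and ReportEngine are invented names; only the module/kernel calls are Ninject's actual API):

        using Ninject;
        using Ninject.Modules;

        // Lives in its own assembly; a client can drop in a replacement dll
        // with different bindings without touching the main library.
        public class DefaultBindings : NinjectModule
        {
            public override void Load()
            {
                Bind<IParser>().To<XmlParser>();
                Bind<ReportEngine>().ToSelf().InSingletonScope();
            }
        }

        // The library's logical entry point doubles as the composition root:
        // the only place that references the container at all.
        public static class LibraryEntryPoint
        {
            public static ReportEngine Create()
            {
                var kernel = new StandardKernel(new DefaultBindings());
                return kernel.Get<ReportEngine>();
            }
        }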

    Read the article

  • MVC Scaffold Template Not Generating?

    - by monkey9987
    I'd been working on an MVC project and my templates were not generating. I first created my model inside my "Models" folder, then did a quick compile. Next I went to the Views folder to get the view created: right-click, "Add View", then I checked the checkbox to create an edit page. What happened was that the template never seemed to pull in my model; it would just have the default header items, and the entire model was missing. My model was defined as follows:

        public class LogOnModel
        {
            [Required]
            [Display(Name = "User name")]
            public string UserName;

            [Required]
            [DataType(DataType.Password)]
            [Display(Name = "Password")]
            public string Password;

            [Display(Name = "Remember me?")]
            public bool RememberMe;
        }

    See anything wrong with that? I couldn't figure out why, each time I created my view and selected the option to create the "Edit" scaffold automatically, it would come up blank. Turns out I was missing the get/set accessors on the model's members - the scaffolder only picks up properties, not fields. Here's my code with the correct setup:

        public class LogOnModel
        {
            [Required]
            [Display(Name = "User name")]
            public string UserName { get; set; }

            [Required]
            [DataType(DataType.Password)]
            [Display(Name = "Password")]
            public string Password { get; set; }

            [Display(Name = "Remember me?")]
            public bool RememberMe { get; set; }
        }

    I hope that helps someone out; it's pretty simple when I look at it now, but that's always the case! ~ Steve

    Read the article
