Search Results

Search found 11837 results on 474 pages for 'third party libraries'.

  • How can I move a library inside a project's source tree and compile static binaries?

    - by AbrahamVanHelpsing
    How can I move a library inside a project's source tree and compile static binaries? I want to use a particular tool that utilizes ANCIENT binaries without upgrading its API usage. This way I can use the old libraries inside a single binary without wrecking the local host environment. I am on *nix with NetBeans/Eclipse/Code::Blocks. I don't have a problem reading, just looking for a starting point. Any thoughts?

    Read the article

  • dun goofed the files in /usr/lib/x86_64-linux-gnu/

    - by tipu
    There was some weird package issue with (in my limited understanding) 32/64-bit libraries, so I went around making symlinks from the files my LAMP installation expected to the ones that actually existed. I did this for a number of files in /usr/lib/x86_64-linux-gnu/. However, PHP still ended up not working (separate issue) and now I believe I have a screwed-up lib directory. Is there a way to revert those library files?

    Read the article

  • Are there software options (preferably .NET) for doing distance and speed analysis of footballers moving on video?

    - by Anonymous Type
    Edit (for clarity): Thanks for the feedback so far, very insightful. I'm not sure how far along this part of the software community is, and what libraries, if any, exist for me to leverage. Here's what I'm trying to do.

    Problem: Take an existing video of a game of rugby league. The rugby league field is 100 metres long, 70 metres wide, and has white line markings every 10 metres running across the width of the field, as well as along the sidelines. Each side has 13 players on the field. Players on each team have identical jerseys that normally contrast strongly against the background colours (green/brown field colour), the referee's colour (usually yellow) and the designated water runner (orange). All players have a unique number in thick white lettering on their backs for identification. Video is taken with a high-definition camera. Currently only one camera is used (2D) and existing video does not contain a foreground object of fixed spatial dimensions (as suggested in one answer for comparison measurements; however, I could add this to future filming sessions if it is worthwhile). The players do not run in a straight line 50% of the time but will go sideways or on a diagonal to the play-the-ball. The distance measured always starts from the spot of the previous "tackle", which ends where the player stops forward movement. It is not always possible to determine a player's number from the video (facing the other direction, sunlight, others standing in the way of the camera), but this isn't important, as the software could allow for manual inputting of unknown "runs" at a later point after analysis. The task is to determine the distance between two points (i.e. where the player started his "run" and where he finished it). I'm guessing that this would be quite doable if I manually marked the start and end point in the video, but how would I use landmarks in the background to determine the distance (assuming the person taking the video has kept it from jerking around)?

    Question: Do software packages or libraries exist that are specialised enough to assist with writing analysis software to determine a sportsperson's distance travelled, based on video taken of the performance?
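
    For the measurement step itself (independent of any particular library), here is a rough C# illustration. It is only a sketch under simplifying assumptions: the camera is roughly fixed, perspective distortion over the run is small, and the user manually clicks two adjacent 10-metre field lines to calibrate a pixels-per-metre scale before marking the start and end of the run. All names below are hypothetical, not from any existing package.

        using System;

        public static class RunMeasurement
        {
            // Straight-line distance in pixels between two manually marked points.
            static double PixelDistance(double x1, double y1, double x2, double y2)
            {
                double dx = x2 - x1;
                double dy = y2 - y1;
                return Math.Sqrt(dx * dx + dy * dy);
            }

            // Calibrate pixels-per-metre from two adjacent 10 m field lines,
            // then convert a manually marked run from pixels to metres.
            public static double RunLengthMetres(
                double lineAx, double lineAy,   // a point on one 10 m line
                double lineBx, double lineBy,   // the matching point on the next 10 m line
                double startX, double startY,   // where the run started (marked manually)
                double endX, double endY)       // where the run ended
            {
                double pixelsPerMetre = PixelDistance(lineAx, lineAy, lineBx, lineBy) / 10.0;
                return PixelDistance(startX, startY, endX, endY) / pixelsPerMetre;
            }
        }

    A perspective-correct version would replace the single scale factor with a homography fitted to several known field-line intersections, which is where a computer-vision library (for example OpenCV via a .NET wrapper such as Emgu CV) would come in.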

    Read the article

  • Existing JS libs for tileset / map loading and rendering?

    - by ylluminate
    I'm building an RTS-style overhead tileset game with JavaScript (particularly using the Ember.js framework as a base). The map is so large that I'd very much like to be able to load and render the board and layered items in a Google Maps-esque fashion. I'm curious whether there are existing libs that are well thought out and would be helpful here, versus trying to reinvent the wheel. Are there any such libraries or code examples that would be useful in this area of board / map management?

    Read the article

  • Third monitor randomly defaults to 640 x 480 on startup

    - by andybaird
    I purchased a PowerColor HD 5770 video card so I could get a third monitor working. I have two widescreen monitors, one attached via HDMI and the other attached via DVI. My third monitor is attached from DisplayPort to VGA (using a passive connector like this one). The third monitor is a 19" Hyundai L90D. I was unable to find any Windows 7 (or Vista for that matter) drivers for the monitor, so it's stuck with "Generic Non-PNP Monitor". Its native resolution is 1280 x 1024. Randomly Windows will boot up in the correct native res, but sometimes it boots up in 640 x 480 res. When it does boot up into 640 x 480 res, the screen resolution slider is stuck and I cannot slide it back to 1280 x 1024. I cannot find a pattern for when or why it does this; I've tried rebooting five or six times in a row at times to get it to boot into native res, but this doesn't always work.

    Read the article

  • Best jQuery Libraries, Plug-Ins and Controls

    - by schnieds
    Worried About The Loss Of ASP.NET Controls in MVC? Don't Be.
    If you are hesitant about moving to ASP.NET MVC because you are worried about losing all of the awesome ASP.NET controls that you are so used to using, don't be. Wonderful client side controls already exist to replace most, if not all, of the most used ASP.NET controls (and these controls provide a MUCH BETTER user experience). Here is a list of my favorite jQuery plug-ins and libraries that make user interface development so much easier... [Read More Here]
    Aaron Schnieder, http://www.churchofficeonline.com

    Read the article

  • What are good JS libraries for game dev?

    - by acidzombie24
    If I decide to write a simple game, both text-based and graphical (2D), what libraries would I use? (Assume we are using an HTML5-compatible browser.) The main things I can think of:
    - Rendering text on screen
    - Animating sprites (using images/CSS)
    - Input (capturing the arrow keys and getting relative mouse positions)
    - Perhaps preloading resources, or dynamically loading resources and choosing the order
    - Sound (but I am unsure how important this will be to me at first), perhaps with mixing and chaining sounds or looping until stopped
    - Networking (low priority) to connect one user to another, or to continuously GET data without multiple requests (I know this exists but I don't know how easy it is to set up or use; it isn't important to me, it's just for the question)

    Read the article

  • ASP.NET: Including JavaScript libraries conditionally from CDN

    - by DigiMortal
    When developing cloud applications it is still useful to build them so they can also run on a local machine without a network connection. One thing you use from a CDN when in the cloud and from the app folder when not connected is common JavaScript libraries. In this posting I will show you how to add support for local and CDN script stores to your ASP.NET MVC web application.

    Our solution is simple. We will add a new configuration setting to our web.config file (including its cloud transform file) and a new property to our web application. In the master page where scripts are included, we will include scripts from the CDN conditionally. There is nothing complex; all the changes we make are simple ones.

    1. Adding a new property to the web application
    Although I am using an ASP.NET MVC web application, these modifications also work very well with ASP.NET Forms. Open Global.asax and add a new static property to your application class.

        public static bool UseCdn
        {
            get
            {
                var valueString = ConfigurationManager.AppSettings["useCdn"];
                bool useCdn;

                bool.TryParse(valueString, out useCdn);
                return useCdn;
            }
        }

    If you want fewer round-trips to configuration data you can keep a nullable boolean in application class scope and load the CDN setting the first time it is asked for.

    2. Adding a new configuration setting to web.config
    By default my application uses local scripts. Although my application runs in the cloud, I can do a lot of stuff without a staging environment in the cloud. So by default I don't incur traffic costs when including scripts from application folders.

        <appSettings>
          <add key="UseCdn" value="false" />
        </appSettings>

    You can also set the UseCdn value to true and change it to false when you are not connected to the network.

    3. Modifying the web.config cloud transform
    I have a special configuration for my solution that I use when deploying my web application to the cloud. This configuration is called Cloud and its transform is located in web.cloud.config. To make the application use the CDN when deployed to the cloud we need the following transform.

        <appSettings>
          <add key="UseCdn"
               value="true"
               xdt:Transform="SetAttributes"
               xdt:Locator="Match(key)" />
        </appSettings>

    Now when you publish your application to the cloud it uses the CDN by default.

    4. Including scripts in master pages
    The last thing we need to change is our master page. My solution is simple: I check whether I have to include scripts from the CDN, and if so I include them from there. Otherwise my scripts are included from the application folder.

        @if (MyWeb.MvcApplication.UseCdn)
        {
            <script src="http://ajax.microsoft.com/ajax/jquery/jquery-1.4.4.min.js" type="text/javascript"></script>
        }
        else
        {
            <script src="@Url.Content("~/Scripts/jquery-1.4.4.min.js")" type="text/javascript"></script>
        }

    Although only one script is shown here, you can add all your scripts that are also available on some CDN to this if-else block. You are free to include scripts from different CDN services if you need to.

    Conclusion
    As we saw, it was very easy to modify our application to use a CDN for JavaScript libraries in the cloud and local scripts when run on a local machine. We made only small changes to our application code, configuration and master pages to support different script sources. Our application is now more independent from external sources when we are working on it.

    Read the article

  • Windows 7 Libraries corruption

    - by Pablo
    I've provided an image to show the issue that I'm currently having. Basically the Libraries work fine in Windows Explorer, but when trying to access them from a browser (to upload a file, for example) the Libraries just don't work; they seem to be empty. Looking at the image, the window on the left is from the Chrome browser and the window on the right is from Windows Explorer. Image Link *Can't post images yet. Any ideas? I've also tried creating new libraries, with no luck.

    Read the article

  • Windows 7 inbuilt and 3rd party (de)fragmentation related queries

    - by Karan
    I have a pretty good idea of how files end up getting fragmented. That said, I just copied ~3,200 files of varying sizes (from a few KB to ~20GB) from an external USB HDD to an internal, freshly formatted (under Windows 7 x64), NTFS, 2TB, 5400RPM, WD, SATA, non-system (i.e. secondary) drive, filling it up 57%. Since it should have been very much possible for each file to have been stored in one contiguous block, I expected the drive to be fragmented not more than 1-2% at most after this rather lengthy exercise (unfortunately this older machine doesn't support USB 3.0). Windows 7's inbuilt defrag utility told me after a quick analysis that the drive was fragmented only 1% or so, which dovetailed neatly with my expectations. However, just out of curiosity I downloaded and ran the latest portable x64 version of Piriform's Defraggler, and was shocked to see the drive being reported as being ~85% fragmented! The portable version of Auslogics Disk Defrag also agreed with Defraggler, and both clearly expected to grind away for ~10 hours to completely defragment the drive.

    1) How in blazes could the inbuilt and 3rd party defrag utils disagree so badly? I mean, 10-20% variance is probably understandable, but 1% and 85% are miles apart! This Engineering Windows 7 blog post states:

        In Windows XP, any file that is split into more than one piece is considered fragmented. Not so in Windows Vista if the fragments are large enough – the defragmentation algorithm was changed (from Windows XP) to ignore pieces of a file that are larger than 64MB. As a result, defrag in XP and defrag in Vista will report different amounts of fragmentation on a volume. ... [Please read the entire post so the quote is not taken out of context.]

    Could it simply be that the 3rd party defrag utils ignore this post-XP change and continue to use analysis algos similar to those XP used?

    2) Assuming that the 3rd party utils aren't lying about the real extent of fragmentation (which Windows is downplaying post-XP), how could the files have even got fragmented so badly given they were just copied over afresh to an empty drive?

    3) If vastly differing analysis algos explain the yawning gap, which do I believe? I'm no defrag fanatic for sure, but 85% is enough to make me seriously consider spending 10 hours defragging this drive. On the other hand, 1% reported by Windows' own defragger clearly implies that there is no cause for concern and defragging would actually have negative consequences (as per the post). Is Windows' assumption valid and should I just let it be, or will there be any noticeable performance gains after running one of the 3rd party utils for 10 hours straight?

    4) I see that out of the box Windows 7 defrag is scheduled to run weekly. Does anyone know whether it defrags every single time, or only if its analysis reveals a fragmentation percentage over a set threshold? If the latter, what is this threshold and can it be changed, maybe via a Registry edit?

    Thanks for reading through (my first query on this wonderful site!) and for any helpful replies. Also, if you're answering question #3, please keep in mind that any speed increases post defragging with 3rd party utils vis-à-vis Windows' inbuilt program should not include pre-Vista (preferably pre-Win7) examples. Further, examples of programs that made your system boot faster won't help in this case, since this is a non-system drive (although one that'll still be used daily).

    Read the article

  • Windows 7 - All Icons Missing, Explorer Progress Bar Never Finishes, Libraries Gone

    - by Alex
    Since yesterday I've had three issues which all arose at the same time. Windows 7 x64, i7 2.8 GHz, 12 GB DDR3.

    1 - My libraries, favorites, and drives are missing... basically the entire sidebar is gone. http://i.imgur.com/m8pRQ.png. Yet when I open a dialog, my libraries and drives are back to normal ONLY for the dialog. I tried Restore Default Libraries. It works one time, but when I open libraries again I go back to the empty mess. Restarting the computer temporarily fixes the problem.

    2 - In the explorer window that's showing libraries, when I navigate to a certain folder I get an unending progress bar (the kind that turns the address bar green). Yesterday when the problem started, I was saving a file to this folder. The program writing the file crashed during the write and I believe that's what caused the problem. I have SugarSync backing up that folder, and when I restarted the computer SugarSync informed me that its database was corrupted, so I had to uninstall and reinstall the software.

    3 - Icons are missing. The Rebuild Icon Cache did not fix this. http://i.imgur.com/r9pgo.png

    Restarting the computer temporarily fixes these problems, but when I open the directory with the initial write problem, everything stops working.

    Edit: I should note that I did a chkdsk /f and it repaired problems. I also did the thing that verifies then restores Windows files (can't remember the command now), which reported that everything was normal.

    Read the article

  • Remove Libraries from Windows 8 Explorer sidebar

    - by FiveO
    I had removed the Libraries from my Windows 7 with this registry tweak, but since the update to Windows 8 the Libraries are back in my Windows Explorer. So I tried to tweak the registry again, but it fails to get permission to change the value (in Windows 7 it worked). http://www.askvg.com/how-to-remove-libraries-from-windows-7-explorers-navigation-pane/ Does anyone know how to remove the Libraries folder, or how to get the permission to change the value? Here is where it fails to get the permission:

    Read the article

  • Typical practice for redistributing third party source code with your source code

    - by bglenn
    I'm releasing an application I wrote as an open-source project by creating a public source-code repository. I use a third-party library which is also open-source and freely redistributable. I'm not versioning the third-party library, but should I include it in my repository for the convenience of those cloning the repository or should I expect them to download the third-party library on their own? To be clear, I'm not asking if I should version the third-party code or if I can redistribute it, but whether it is standard practice to include third-party source code as a convenience.

    Read the article

  • Recognizing synchronization object hanging two 3rd party executables

    - by eran
    I'm using a 3rd party tool, which uses a 4th party plugin. Occasionally, the tool will hang when launched. Looking at the stack traces, I can see a few threads are waiting on WaitForSingleObject, and my bet is that they're blocking each other. Some of the threads start in the 3rd party tool, and some in the 4th party plugin. What I'd like to do is file the most detailed complaint possible with the 3rd party tool vendor, assuming it's its fault (I don't trust their local support to get those details themselves). For that, I'd like to: find out which synchronization objects are currently being waited on, and find out who created those synchronization objects. Tools currently at hand are VS2005, WinDbg and Process Explorer. OS is Windows 7 64-bit. Any suggestions?

    Read the article

  • Setting up separate ctags db's for C/C++ standard libs, boost, and third party libs

    - by Robert S. Barnes
    I want to set up separate ctags databases for various libraries in /usr/include/ for use with OmniCppComplete. The idea is to be able to pull in only the libraries needed for a particular project in the target language - C or C++. For example, I'd like to have one database for the standard C libraries, one for system libraries that might be used by either C or C++ programs (sockets / networking comes to mind), one for the standard C++ libs / STL / Boost, and then other databases for various third party libraries such as QT or glib. Then I could pull something in simply by typing set tags+= ~/.vim/somelib.tags in vim. I assume that everything related to the C++ stdlib and STL is in /usr/include/c++ and that Boost is all in /usr/include/boost. Unfortunately it seems that the standard C libs and system libs are just kind of dumped directly into /usr/include/ with a variety of other stuff. How can I get a list of which files and directories belong to which libs? I'm on Ubuntu 8.04.
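
    One possible starting point, sketched for Exuberant Ctags and dpkg on Ubuntu (the output file names below are just examples): dpkg can tell you which package a header belongs to, and ctags can write each database to its own file.

        # Which package owns a given header under /usr/include?
        dpkg -S /usr/include/stdio.h
        # List every file and directory installed by that package
        dpkg -L libc6-dev

        # One tag database per library area, kept under ~/.vim
        ctags -R --c++-kinds=+p --fields=+iaS --extra=+q -f ~/.vim/cpp_stl.tags /usr/include/c++/
        ctags -R --c++-kinds=+p --fields=+iaS --extra=+q -f ~/.vim/boost.tags /usr/include/boost/

    The flat headers directly in /usr/include are the awkward part, as noted above; running dpkg -S on each file (or dpkg -L on the packages you care about) is one way to sort them into per-library lists before running ctags over them.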

    Read the article

  • Portable class libraries and fetching JSON

    - by Jeff
    After much delay, we finally have the Windows Phone 8 SDK to go along with the Windows 8 Store SDK, or whatever ridiculous name they're giving it these days. (Seriously... that no one could come up with a suitable replacement for "metro" is disappointing in an otherwise exciting set of product launches.) One of the neat-o things is the potential for code reuse, particularly across Windows 8 and Windows Phone 8 apps. This is accomplished in part with portable class libraries, which allow you to share code between different types of projects. With some other techniques and quasi-hacks, you can share some amount of code, and I saw it mentioned in one of the Build videos that they're seeing as much as 70% code reuse. Not bad.

    However, I've already hit a super annoying snag. It appears that the HttpClient class, with its idiot-proof async goodness, is not included in the Windows Phone 8 class libraries. Shock, gasp, horror, disappointment, etc. The delay in releasing it already caused dismay among developers, and I'm sure this won't help.

    So I started refactoring some code I already had for a Windows 8 Store app (ugh) to accommodate the use of HttpWebRequest instead. I haven't tried it in a Windows Phone 8 project beyond compiling, but it appears to work. I used this StackOverflow answer as a starting point since it's been a long time since I used HttpWebRequest, and keep in mind that it has no exception handling. It needs refinement. The goal here is to new up the client, and call a method that returns some deserialized JSON objects from the Intertubes. Adding facilities for headers or cookies is probably a good next step. You need to use NuGet for a Json.NET reference. So here's the start:

        using System.Net;
        using System.Threading.Tasks;
        using Newtonsoft.Json;
        using System.IO;

        namespace MahProject
        {
            public class ServiceClient<T> where T : class
            {
                public ServiceClient(string url)
                {
                    _url = url;
                }

                private readonly string _url;

                public async Task<T> GetResult()
                {
                    var response = await MakeAsyncRequest(_url);
                    var result = JsonConvert.DeserializeObject<T>(response);
                    return result;
                }

                public static Task<string> MakeAsyncRequest(string url)
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.ContentType = "application/json";

                    Task<WebResponse> task = Task.Factory.FromAsync(
                        request.BeginGetResponse,
                        asyncResult => request.EndGetResponse(asyncResult),
                        null);

                    return task.ContinueWith(t => ReadStreamFromResponse(t.Result));
                }

                private static string ReadStreamFromResponse(WebResponse response)
                {
                    using (var responseStream = response.GetResponseStream())
                    using (var reader = new StreamReader(responseStream))
                    {
                        var content = reader.ReadToEnd();
                        return content;
                    }
                }
            }
        }

    Calling it in some kind of repository class may look like this, if you wanted to return an array of Park objects (Park model class omitted because it doesn't matter):

        public class ParkRepo
        {
            public async Task<Park[]> GetAllParks()
            {
                var client = new ServiceClient<Park[]>("http://superfoo/endpoint");
                return await client.GetResult();
            }
        }

    And then from inside your WP8 or W8S app (see what I did there?), when you load state or do some kind of UI event handler (making sure the method uses the async keyword):

        var parkRepo = new ParkRepo();
        var results = await parkRepo.GetAllParks();
        // bind results to some UI or observable collection or something

    Hopefully this saves you a little time.

    Read the article

  • Extreme Optimization Numerical Libraries for .NET – Part 1 of n

    - by JoshReuben
    While many of my colleagues are fascinated by constructing the ultimate ViewModel or ServiceBus, I feel that this kind of plumbing code is re-invented far too many times – at some point in the near future, it will be out-of-the-box standard infrastructure. How many times have you been to a customer site and built a different variation of the same kind of code frameworks? How many times can you abstract Prism or reliable and discoverable WCF communication? As the bar is raised for what's bundled with the framework and more tasks become declarative, automated and configurable, information systems will expose a higher level of abstraction, forcing software engineers to focus on more advanced computer science and algorithmic tasks. I've spent the better half of the past decade building skills in .NET and expanding my mathematical horizons by working through the Schaum's guides. In this series I am going to examine how these skill sets come together in the implementation provided by Extreme Optimization. Download the trial version here: http://www.extremeoptimization.com/downloads.aspx

    Overview
    The library implements a set of algorithms for: linear algebra, complex numbers, numerical integration and differentiation, solving equations, optimization, random numbers, regression, ANOVA, statistical distributions, hypothesis tests. EONumLib combines three libraries in one, organized in a consistent namespace hierarchy:
    - Mathematics Library - Extreme.Mathematics namespace
    - Vector and Matrix Library - Extreme.Mathematics.LinearAlgebra namespace
    - Statistics Library - Extreme.Statistics namespace

    System Requirements: .NET Framework 4.0

    Mathematics Library
    The classes are organized into the following namespace hierarchy:
    - Extreme.Mathematics – common data types, exception types, and delegates.
    - Extreme.Mathematics.Calculus - numerical integration and differentiation of functions.
    - Extreme.Mathematics.Curves - points, lines and curves, including polynomials and Chebyshev approximations; curve fitting and interpolation.
    - Extreme.Mathematics.Generic - generic arithmetic & linear algebra.
    - Extreme.Mathematics.EquationSolvers - root finding algorithms.
    - Extreme.Mathematics.LinearAlgebra - vectors, matrices, matrix decompositions, solvers for simultaneous linear equations and least squares.
    - Extreme.Mathematics.Optimization – multi-dimensional function optimization plus linear programming.
    - Extreme.Mathematics.SignalProcessing - one- and two-dimensional discrete Fourier transforms.
    - Extreme.Mathematics.SpecialFunctions

    Read the article

  • Application toolkits like Qt versus traditional game/multimedia libraries like SFML

    - by Aaron
    I currently intend to use SFML for my next game project. I'll need a substantial GUI though (RPG/strategy-type), so I'll either have to implement my own or try to find an appropriate third party library; the options seem to boil down to CEGUI, libRocket, and GWEN. At the same time, I do not anticipate doing that many advanced graphical effects. My game will be 2D and primarily sprite-based, with some sprite animations. I've recently discovered that Qt applications can have their appearance styled so that they don't have to look like plain OS apps. Given that, I am beginning to consider Qt a valid alternative to SFML. I wouldn't have to implement the GUI functionality I'd need, and I may not be taking advantage of SFML's lower-level access anyway. The only drawbacks I can think of immediately are the learning curve for Qt and figuring out how to fit game logic inside such a framework after getting used to the input/update/render loop of traditional game libraries. When would an application toolkit like Qt be more appropriate for a game than a traditional game or multimedia library like SFML?

    Read the article

  • Building a Titanium module with Xcode that uses other libraries

    - by kudorgyozo
    I have a Titanium module and I want to use it for voice over IP using pjsip. I have changed the project settings in the following way:
    - added the libraries from pjsip to the other linker flags
    - added the headers from pjsip to the header search paths
    - added the libraries from pjsip to the library search paths
    If I do these things for a normal iPhone app it works: I can make calls, I have tested it, and I made a wrapper class that has methods like makeCall, hangup, etc. But I want to use this class together with the libraries from pjsip in a Titanium module. It gives me errors like: implicit declaration of function 'pjsua_perror', implicit declaration of function 'pjsua_destroy', 'pjsua_config' undeclared (first use in this function). These are all part of pjsip (pjsua_perror and pjsua_destroy are functions and pjsua_config is a struct). Does it work this way? Can I include other libraries in a library? What is the difference between making an app that uses libraries and making a library that uses libraries?

    Read the article

  • How to remove explicit dependencies on other projects' libraries in an Eclipse launch configuration

    - by euluis
    In Eclipse it is possible to create launch configurations in a project, specifying runtime dependencies on another project. A problem I found is that if you have a multi-project workspace, where each project can have its own libraries, it is easy to add explicit dependencies in a secondary project to libraries that belong to another project and are therefore subject to change. An example of the problem follows:

        proj1
        +-- src
        +-- lib
            +-- jar1-v1.0.jar
            +-- jar2-v1.0.jar
        proj2
        +-- src
        +-- proj2-tests.launch

    I don't have a dependency from the code in proj2/src to the libraries in proj1/lib. Nevertheless, I do have a dependency from proj2/src to proj1/src, and since there is an internal dependency from the code in proj1/src to its libraries jar1-v1.0.jar and jar2-v1.0.jar, I have to add a dependency in proj2-tests.launch to the libraries in proj1/lib. This translates to the following ugly lines in proj2-tests.launch:

        <listEntry value="<?xml version="1.0" encoding="UTF-8" standalone="no"?> <runtimeClasspathEntry path="3" projectName="proj1" type="1"/> "/>
        <listEntry value="<?xml version="1.0" encoding="UTF-8" standalone="no"?> <runtimeClasspathEntry internalArchive="/proj1/lib/jar1-v1.0.jar" path="3" type="2"/> "/>
        <listEntry value="<?xml version="1.0" encoding="UTF-8" standalone="no"?> <runtimeClasspathEntry internalArchive="/proj1/lib/jar2-v1.0.jar" path="3" type="2"/> "/>

    This wouldn't be a big problem if there wasn't the need, from time to time, to evolve the software, upgrade the libraries, etc. Consider the common need to upgrade the libraries jar1-v1.0.jar and jar2-v1.0.jar to their v1.1 versions. Consider that you have about 10 projects in one workspace, each with about 5 libraries and about 4 launch configurations. You get a maintenance overhead when doing a simple upgrade of a library, which forces changes in files that shouldn't have needed any. Or maybe I'm doing something wrong... What I would like to state is that proj2 depends on proj1 and on its libraries, and have this translated to simply that in the *.launch files. Is that possible?

    Read the article

  • Meeting attendees missing for organizer after 3rd party accepts meeting

    - by jonath2002
    Outlook 2007, Exchange 2003 EE. The meeting organizer created a meeting (some attendees part of the domain, some not). The organizer then updated the meeting with additional attendees and changed the time (again, some attendees in the domain, some not). One of the attendees forwarded the meeting to someone not on the attendee list, and that person (not in the domain) accepted the meeting. The organizer now only sees himself and the 3rd party person in the attendee list. Troubleshooting: attendees that have accepted the meeting show all attendees in their list (excluding the 3rd party); an attendee re-accepted the invite and gets added to the organizer's list.

    Read the article

  • Tool for Network game party

    - by nXqd
    I'm looking for tools for a network game party (LAN). This is my first time joining as tech support for such a party. I already know some tools like desktop sharing and desktop lock. Can anyone suggest more tools? Links are appreciated :)

    Read the article

  • How to encapsulate a third party complex object structure?

    - by tangens
    Motivation: Currently I'm using the java parser japa to create an abstract syntax tree (AST) of a java file. With this AST I'm doing some code generation (e.g.: if there's an annotation on a method, create some other source files, ...).

    Problem: As my code generation becomes more complex, I have to dive deeper into the structure of the AST (e.g. I have to use visitors to extract type information about method parameters). But I'm not sure if I want to stay with japa or if I will change the parser library later. Because my code generator uses freemarker (which isn't good at automatic refactoring), I want the interface it uses to access the AST information to be stable, even if I decide to change the java parser.

    Question: What's the best way to encapsulate complex data structures of third party libraries?
    - I could create my own datatypes and copy the parts of the AST that I need into these.
    - I could create lots of specialized access methods that work with the AST and produce exactly the information I need (e.g. the fully qualified return type of a method as one string, or the first template parameter of a class).
    - I could create wrapper classes for the japa datastructures I currently need and embed the japa types inside, so that I can delegate requests to the japa types and transform the resulting japa types to my wrapper classes again.
    Which solution should I take? Are there other (better) solutions to this problem?

    Read the article

  • C# graph library to be used from Unity3D

    - by Heisenbug
    I'm looking for a C# graph library to be used inside a Unity3D script. I'm not looking for pathfinding libraries (I know there are good ones available). I could consider using a pathfinding library only if it gives me direct access to the underlying graph classes (I need nodes and edges, and classic graph algorithms). The only product I've seen that seems interesting is QuickGraph. I have the following questions: Is it possible to use QuickGraph inside Unity3D? If yes: Is this a good idea? Does it have any drawbacks? Is it a reasonably fast and well written/supported library? Has anyone ever used it? Are other C# graph libraries available that can be easily integrated into Unity3D?
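
    If pulling an external assembly into Unity turns out to be more trouble than it's worth, one fallback is a small hand-rolled adjacency-list graph with the classic algorithms written directly against it. The sketch below is hypothetical (it is not QuickGraph's API) and sticks to plain collections that Unity's Mono profile supports; breadth-first search stands in for the "classic graph algorithms" mentioned above.

        using System.Collections.Generic;

        // Minimal directed graph stored as adjacency lists.
        public class SimpleGraph<T>
        {
            private readonly Dictionary<T, List<T>> adjacency = new Dictionary<T, List<T>>();

            public void AddNode(T node)
            {
                if (!adjacency.ContainsKey(node))
                    adjacency[node] = new List<T>();
            }

            public void AddEdge(T from, T to)
            {
                AddNode(from);
                AddNode(to);
                adjacency[from].Add(to);
            }

            // Breadth-first traversal starting from 'start'.
            public IEnumerable<T> BreadthFirst(T start)
            {
                var visited = new HashSet<T> { start };
                var queue = new Queue<T>();
                queue.Enqueue(start);

                while (queue.Count > 0)
                {
                    T current = queue.Dequeue();
                    yield return current;

                    List<T> neighbours;
                    if (!adjacency.TryGetValue(current, out neighbours))
                        continue;

                    foreach (T next in neighbours)
                    {
                        if (visited.Add(next))   // Add returns false if already visited
                            queue.Enqueue(next);
                    }
                }
            }
        }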

    Read the article
