Search Results

Search found 4099 results on 164 pages for 'party'.


  • Can you notice what's wrong with my PHP or MySQL code?

    - by Jenna
    I am trying to create a category menu with sub-categories. I have the following MySQL table:

        -- Table structure for table `categories`
        CREATE TABLE IF NOT EXISTS `categories` (
          `ID` int(11) NOT NULL AUTO_INCREMENT,
          `name` varchar(1000) NOT NULL,
          `slug` varchar(1000) NOT NULL,
          `parent` int(11) NOT NULL,
          `type` varchar(255) NOT NULL,
          PRIMARY KEY (`ID`)
        ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=66;

        -- Dumping data for table `categories`
        INSERT INTO `categories` (`ID`, `name`, `slug`, `parent`, `type`) VALUES
        (63, 'Party', '/category/party/', 0, ''),
        (62, 'Kitchen', '/category/kitchen/', 61, 'sub'),
        (59, 'Animals', '/category/animals/', 0, ''),
        (64, 'Pets', '/category/pets/', 59, 'sub'),
        (61, 'Rooms', '/category/rooms/', 0, ''),
        (65, 'Zoo Creatures', '/category/zoo-creatures/', 59, 'sub');

    And the following PHP:

        <?php
        include("connect.php");
        echo "<ul>";
        $query = mysql_query("SELECT * FROM categories");
        while ($row = mysql_fetch_assoc($query)) {
            $catId   = $row['id'];
            $catName = $row['name'];
            $catSlug = $row['slug'];
            $parent  = $row['parent'];
            $type    = $row['type'];
            if ($type == "sub") {
                $select = mysql_query("SELECT name FROM categories WHERE ID = $parent");
                while ($row = mysql_fetch_assoc($select)) {
                    $parentName = $row['name'];
                }
                echo "<li>$parentName >> $catName</li>";
            } else if ($type == "") {
                echo "<li>$catName</li>";
            }
        }
        echo "</ul>";
        ?>

    Now here's the problem. It displays this:

        * Party
        * Rooms >> Kitchen
        * Animals
        * Animals >> Pets
        * Rooms
        * Animals >> Zoo Creatures

    I want it to display this:

        * Party
        * Rooms >> Kitchen
        * Animals >> Pets >> Zoo Creatures

    Is there something wrong with my loop? I just can't figure it out.

  • Qt/MFC Migration Framework tool: properly exiting DLL?

    - by User
    I'm using the Qt/MFC Migration Framework tool following this example: http://doc.qt.nokia.com/solutions/4/qtwinmigrate/winmigrate-qt-dll-example.html The DLL I build is loaded by a 3rd party MFC-based application. The 3rd party app basically calls one of my exported DLL functions to start up my plugin and another function to shut down my application. Currently I'm doing nothing in my shutdown function. When I load my DLL in the 3rd party app, the startup function is called, my DLL starts successfully and I can see my message box. However, if I shut down my plugin and then try to start it again I get the following error:

        Debug Error!
        Program: <my 3rd party app>
        Module: 4.7.1
        File: global\qglobal.cpp
        Line: 2262
        ASSERT failure in QWidget: "Widgets must be created in the GUI thread.",
        file kernel\qwidget.cpp line 1233
        (Press Retry to debug the application)
        Abort  Retry  Ignore

    This makes me think I'm not doing something to properly shut down my plugin. What do I need to do to shut it down properly?

    UPDATE: http://doc.qt.nokia.com/solutions/4/qtwinmigrate/winmigrate-walkthrough.html says: "The DLL also has to make sure that it can be loaded together with other Qt based DLLs in the same process (in which case a QApplication object will probably exist already), and that the DLL that creates the QApplication object remains loaded in memory to avoid other DLLs using memory that is no longer available to the process." So I wonder if there is some problem where I need to somehow keep the original DLL loaded no matter what?

  • How to compare 2 complex spreadsheets running in parallel for consistency with each other?

    - by tbone
    I am working on converting a large number of spreadsheets to use a new 3rd party data access library (converting from third party library #1 to third party library #2). FYI: a call to a UDF (user defined function) is placed in a cell, and when that is refreshed, it pulls the data into a pivot table below the formula. Both libraries behave the same and produce the same output, except that small irregularities can arise, such as an additional field being shown in the output pivot table using library #2, which can affect formulas on the sheet if data is being read from the pivot table without using GetPivotData.

    So I have ~100 of these very complicated (20+ worksheets per workbook) spreadsheets that I have to convert, and run in parallel for a period of time, to see if the output using the new data access library matches the old library. Is there some clever approach to this, so I don't have to spend a large amount of time analyzing each sheet to determine the specific elements to compare? Two rough ideas that come to mind:

    1. Just create a Validator workbook that has the same number of worksheets, and simply do a Workbook1!Worksheet1!A1 - Workbook2!Worksheet3!A1 for every possible cell on each sheet.
    2. Roughly the equivalent of #1, but just traverse the cells in the 2 books using VBA, and log any cells that do not match.

    I don't particularly like either idea. Can anyone think of something better than this, maybe some 3rd party utility I could buy?
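
    A rough sketch of idea #2, written here in C# against the Excel interop assemblies rather than VBA (illustrative only, not a tested tool: it assumes both workbooks contain identically named worksheets whose used ranges span more than one cell, that Microsoft.Office.Interop.Excel is referenced, and the file paths are hypothetical):

        using System;
        using Excel = Microsoft.Office.Interop.Excel;

        class WorkbookComparer
        {
            static void Main()
            {
                var app = new Excel.Application { Visible = false };
                Excel.Workbook wb1 = app.Workbooks.Open(@"C:\compare\library1.xlsx"); // hypothetical paths
                Excel.Workbook wb2 = app.Workbooks.Open(@"C:\compare\library2.xlsx");
                foreach (Excel.Worksheet ws1 in wb1.Worksheets)
                {
                    var ws2 = (Excel.Worksheet)wb2.Worksheets[ws1.Name];
                    Excel.Range used = ws1.UsedRange;
                    // Value2 returns a 1-based object[,] for multi-cell ranges
                    var v1 = (object[,])used.Value2;
                    int r0 = used.Row, c0 = used.Column;
                    for (int r = 1; r <= v1.GetLength(0); r++)
                        for (int c = 1; c <= v1.GetLength(1); c++)
                        {
                            object a = v1[r, c];
                            object b = ((Excel.Range)ws2.Cells[r0 + r - 1, c0 + c - 1]).Value2;
                            if (!Equals(a, b))   // log any cell that differs between the books
                                Console.WriteLine("{0} R{1}C{2}: '{3}' vs '{4}'", ws1.Name, r, c, a, b);
                        }
                }
                wb1.Close(false); wb2.Close(false); app.Quit();
            }
        }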

  • Are there any changes in the licensing of Visual Studio 2013 Express editions?

    - by Ramón García-Pérez
    As I was going through the license.htm file provided as part of the VS2013_RTM_WebExp_ENU.iso offline installation media for Visual Studio 2013 Express for Web, section 6 reads as follows:

        6. PACKAGE MANAGER AND THIRD PARTY SOFTWARE INSTALLATION FEATURES. The software includes the following features (each a "Feature"), each of which enables you to obtain software applications or packages through the Internet from other sources: Extension Manager, New Project Dialog, Web Platform Installer, and Microsoft NuGet-Based Package Manager. Those software applications and packages are offered and distributed in some cases by third parties and in some cases by Microsoft, but each such application or package is under its own license terms. Microsoft is not developing, distributing or licensing any of the third-party applications or packages to you, but instead, as a convenience, enables you to use the Features to access or obtain those applications or packages directly from the third-party application or package providers. By using the Features, you acknowledge and agree that: you are obtaining the applications or packages from such third parties and under separate license terms applicable to each application or package (including, with respect to the package-manager Features, any terms applicable to software dependencies that may be included in the package); MICROSOFT MAKES NO REPRESENTATIONS, WARRANTIES OR GUARANTEES AS TO THE FEED OR GALLERY URL, ANY FEEDS OR GALLERIES FROM SUCH URL, THE INFORMATION CONTAINED THEREIN, OR ANY SOFTWARE APPLICATIONS OR PACKAGES REFERENCED IN OR ACCESSED BY YOU THROUGH SUCH FEEDS OR GALLERIES. MICROSOFT GRANTS YOU NO LICENSE RIGHTS FOR THIRD-PARTY SOFTWARE APPLICATIONS OR PACKAGES THAT ARE OBTAINED USING THE FEATURES.

    Are there any changes in the licensing of the Visual Studio 2013 Express editions? If so, does this mean that installing Visual Studio extensions in the Express editions is now allowed?

    PS: Previous versions of the Express editions did not allow the installation of extensions as per the EULA/TOS discussed here: Limitations of Visual Studio 2012 Express Desktop

  • Real tortoises keep it slow and steady. How about the backups?

    - by Maria Zakourdaev
    … Four tortoises were playing in the backyard when they decided they needed hibiscus flower snacks. They pooled their money and sent the smallest tortoise out to fetch the snacks. Two days passed and there was no sign of the tortoise. "You know, she is taking a lot of time," said one of the tortoises. A little voice from just outside the fence said, "If you are going to talk that way about me, I won't go."

    Is it too much to ask of a quite expensive 3rd party backup tool that it be way faster than the SQL Server native backup? Or at least that it save a respectable amount of storage by producing really smaller backup files? By saying "really smaller", I mean at least a file half the size.

    After Googling the internet in an attempt to understand what other "SQL people" are using for database backups, I see that most people are using one of three tools, the main players in the SQL backup area:

        - LiteSpeed by Quest
        - SQL Backup by Red Gate
        - SQL Safe by Idera

    The feedback about those tools is truly emotional and happy. However, while reading the forums and blogs I have wondered: is it possible that many are simply accustomed to using the above tools since SQL 2000 and 2005? This can easily be understood, because a 300GB database backup, for instance, using a regular SQL 2005 backup statement, would have run for about 3 hours and produced a ~150GB file (depending on the content, of course). Then you take a 3rd party tool which performs the same backup in 30 minutes resulting in a 30GB file, it leaves you speechless, and you run to management persuading them to buy it because it is definitely worth the price. In addition to the increased speed and disk space savings you would also get backup file encryption and virtual restore - features that are still missing from SQL Server. But in case you, as well as me, don't need these additional features and only want a tool that performs a full backup MUCH faster AND produces a far smaller backup file (like the gain you observed back in the SQL 2005 days), you will be quite disappointed. The SQL Server backup compression feature has totally changed the market picture.

    Medium size database. Take a look at the table below and check out how my SQL Server 2008 R2 compares to the other tools when backing up a 300GB database. It appears that, when talking about backup speed, SQL 2008 R2 compresses and performs the backup in similar overall times as all three other tools. The 3rd party tools' maximum compression level takes twice as long. The backup file gain is not that impressive, except at the highest compression levels, but the price that you pay for those is very high CPU load and a much longer run time. Only SQL Safe by Idera was quite fast at its maximum compression level, but for most of the run time it used 95% CPU on the server. Note that I have used two types of destination storage, SATA 11 disks and FC 53 disks, and, obviously, on the faster storage got my backup ready in half the time.

    Looking at the above results, should we spend money and bother with another layer of complexity and a software middle-man for medium sized databases? I'm definitely not going to do so.

    Very large database. As the next phase of this benchmark, I moved to a 6 terabyte database, which was actually my main backup target. Note how using multiple files enables the SQL Server backup operation to use parallel I/O and remarkably increases its speed, especially when the backup device is heavily striped. SQL Server supports a maximum of 64 backup devices for a single backup operation, but the most speed is gained when using one file per CPU - in the case above, 8 files for a 2 quad-core CPU server. The impact of additional files is minimal. However, SQL Safe doesn't show any speed improvement between 4 files and 8 files.

    Of course, with such huge databases every half percent of compression translates into noticeable numbers. Saving almost 470GB of space may turn the backup tool into quite a valuable purchase. Still, the backup speed and high CPU are variables that should be taken into consideration. As for us, the backup speed is more critical than the storage, and we cannot allow a production server to sustain 95% CPU for such a long time. Bottom line: 3rd party backup tool developers, we are waiting for some breakthrough release. There are a few unanswered questions, like the restore speed comparison between the different tools and the impact of multiple backup files on the restore operation. Stay tuned for the next benchmarks.

    Benchmark server: SQL Server 2008 R2 SP1, 2 quad-core CPUs
    Database location: NetApp FC 15K aggregate, 53 discs

    Backup statements: No matter how good a tool's UI is, we need to run the backup tasks from inside SQL Server Agent to make sure they are covered by our monitoring systems. I have used extended stored procedures (command line execution is also an option; I haven't noticed any impact on backup performance).

    SQL backup:

        backup database <DBNAME>
        to disk= '\\<networkpath>\par1.bak', disk= '\\<networkpath>\par2.bak', disk= '\\<networkpath>\par3.bak'
        with format, compression

    LiteSpeed:

        EXECUTE master.dbo.xp_backup_database @database = N'<DBName>', @backupname = N'<DBName> full backup',
        @desc = N'Test', @compressionlevel = 8,
        @filename = N'\\<networkpath>\par1.bak', @filename = N'\\<networkpath>\par2.bak',
        @filename = N'\\<networkpath>\par3.bak', @init = 1

    SQL Backup (Red Gate):

        EXECUTE master.dbo.sqlbackup '-SQL "BACKUP DATABASE <DBNAME>
        TO DISK= ''\\<networkpath>\par1.sqb'', DISK= ''\\<networkpath>\par2.sqb'', DISK= ''\\<networkpath>\par3.sqb''
        WITH DISKRETRYINTERVAL = 30, DISKRETRYCOUNT = 10, COMPRESSION = 4, INIT"'

    SQL Safe:

        EXECUTE master.dbo.xp_ss_backup @database = 'UCMSDB', @filename = '\\<networkpath>\par1.bak',
        @backuptype = 'Full', @compressionlevel = 4,
        @backupfile = '\\<networkpath>\par2.bak', @backupfile = '\\<networkpath>\par3.bak'

    If you still insist on using 3rd party tools for the backups in your production environment with the maximum compression level, you will definitely need to consider limiting CPU usage, which will increase the backup operation time even more:

        - Red Gate: use the THREADPRIORITY option (values 0-6)
        - LiteSpeed: use @throttle (a percentage, like 70%)
        - SQL Safe: the only thing I have found was the @Threads option.

    Yours, Maria

  • Microsoft Declares the Future of ASP.NET is Web API

    - by sbwalker
    Sitting on a plane on my way home from Tech Ed 2012 in Orlando, I thought it would be a good time to jot down some key takeaways from this year's conference. Some of these items I have known since the Microsoft MVP Summit which occurred in Redmond in late February (but due to NDA restrictions I could not share them with the developer community at large) and some of them are a result of insightful conversations with a wide variety of industry insiders and Microsoft employees at the conference.

    First, let's travel back in time 4 years to the Microsoft MVP Summit in 2008. Microsoft was facing some heat from market newcomer Ruby on Rails and responded with a new web development framework of its own, ASP.NET MVC. At the Summit they estimated that MVC would only be applicable for ~10% of all new web development projects. Based on that prediction I questioned why they were investing such considerable resources in such a relative edge case, but my guess is that they felt it was an important edge case at the time, as some of the more vocal .NET evangelists as well as some very high profile start-ups (i.e. Twitter) had publicly announced their intent to use Rails.

    Microsoft made a lot of noise about MVC. In fact, they focused so much of their messaging and marketing hype around MVC that it appeared that WebForms was essentially dead. Yes, it may have been true that Microsoft continued to invest in WebForms, but from an outside perspective it really appeared that MVC was the only framework getting any real attention. As a result, MVC started to gain market share. An inside source at Microsoft told me that MVC usage has grown at a rate of about 5% per year and now sits at ~30%. Essentially, by focusing so much marketing effort on MVC, Microsoft actually created a larger market demand for it. This is because in the Microsoft ecosystem there is somewhat of a bandwagon mentality amongst developers. If Microsoft spends a lot of time talking about a specific technology, developers get the perception that it must be really important. So rather than choosing the right tool for the job, they often choose the tool with the most marketing hype and then try to sell it to the customer.

    In 2010, I blogged about the fact that MVC did not make any business sense for the DotNetNuke platform. This was because our ecosystem relied on third party extensions which were dependent on the WebForms model. If we migrated the core to MVC it would mean that all of the third party extensions would no longer be compatible, which would be an irresponsible business decision for us to make at the expense of our users and customers. However, this did not stop the debate from continuing to occur in our ecosystem. Clearly some developers had drunk Microsoft's Kool-Aid about MVC and were of the mindset, to paraphrase an old Scottish saying, "If it's not MVC, it's crap".

    Now, this is a rather ignorant position to take, as most of the benefits of MVC can be achieved in WebForms with solid architecture and responsible coding practices. Clean separation of concerns, unit testing, and direct control over page output are all possible in the WebForms model - it just requires diligence and discipline. So over the past few years some horror stories have begun to bubble to the surface of software development projects focused on ground-up rewrites of web applications for the sole purpose of migrating from WebForms to MVC. These large scale rewrites were typically initiated by engineering teams with only a single argument driving the business decision: that Microsoft was promoting MVC as "the future". These ill-fated rewrites offered no benefit to end users or customers and in fact resulted in less stable, less scalable and more complicated systems - basically taking one step forward and two full steps back. A case in point is the announcement earlier this week that a popular open source .NET CMS provider has decided to pull the plug on their new MVC product, which has been under active development for more than 18 months, and revert back to WebForms.

    The availability of multiple server-side development models has deeply fragmented the Microsoft developer community. Some folks like to compare it to the age-old VB vs. C# language debate. However, the VB vs. C# language debate was ultimately more of a religious war, because at least the two dominant programming languages were compatible with one another and could be used interchangeably. The issue with WebForms vs. MVC is much more challenging. This is because the messaging from Microsoft has positioned the two solutions as being incompatible with one another, and as a result web developers feel like they are forced to choose one path or another. Yes, it is true that it has always been technically possible to use WebForms and MVC in the same project, but the tooling support has always made this feel "dirty". The fragmentation has also made it difficult to attract newcomers, as the perceived barrier to entry for learning ASP.NET has become higher. As a result, many new software developers entering the market are gravitating to environments where the development model seems more simple and intuitive (i.e. PHP or Ruby).

    At the same time that the Web Platform team was busy promoting ASP.NET MVC, the Microsoft Office team has been promoting SharePoint as a platform for building internal enterprise web applications. SharePoint has great penetration in the enterprise and over time has been enhanced with improved extensibility capabilities for software developers. But, like many other mature enterprise ASP.NET web applications, it is built on the WebForms development model. Similar to DotNetNuke, SharePoint leverages a rich third party ecosystem for both generic web controls and more specialized WebParts - both of which rely on WebForms. So basically this resulted in a situation where the Web Platform group had headed off in one direction and the Office team had gone in another direction, and the end customer was stuck in the middle trying to figure out what to do with their existing investments in Microsoft technology. It really emphasized the perception that the left hand was not speaking to the right hand, as strategically speaking there did not seem to be any high level plan from Microsoft to ensure consistency and continuity across the different product lines.

    The introduction of ASP.NET MVC also made some of the third party control vendors scratch their heads and wonder what the heck Microsoft was thinking. The original value proposition of ASP.NET over Classic ASP was the ability for web developers to emulate the highly productive desktop development model by using abstract components for creating rich, interactive web interfaces. Web control vendors like Telerik, Infragistics, DevExpress, and ComponentArt had all built sizable businesses offering powerful user interface components to WebForms developers. And even after MVC was introduced, these vendors continued to improve their products, offering greater productivity and a superior user experience via AJAX compared to what was possible in MVC. And since many developers were comfortable and satisfied with these third party solutions, the demand remained strong and the third party web control market continued to prosper despite the availability of MVC.

    While all of this was going on in the Microsoft ecosystem, there has also been a fundamental shift in the general software development industry. Driven by the explosion of Internet-enabled devices, the focus has now centered on service-oriented architecture (SOA). Service-oriented architecture is all about defining a public API for your product that any client can consume, whether it's a native application running on a smart phone or tablet, a web browser taking advantage of HTML5 and Javascript, or a rich desktop application running on a PC. REST-based services, which utilize the less verbose characteristics of JSON as a transport mechanism, have become the preferred approach over older, more bloated SOAP-based techniques. SOA also has the benefit of producing a cross-platform API, as every major technology stack is able to interact with standard REST-based web services. And for web applications, more and more developers are turning to robust Javascript libraries like jQuery and Knockout for browser-based client-side development techniques for calling web services and rendering content to end users. In fact, traditional server-side page rendering has largely fallen out of favor, resulting in decreased demand for server-side frameworks like Ruby on Rails, WebForms, and (gasp) MVC.

    In response to these new industry trends, Microsoft did what it always does - it immediately poured some resources into developing a solution which will ensure they remain relevant and competitive in the web space. This work culminated in a new framework which was branded as Web API. It is convention-based and designed to embrace native HTTP standards without copious layers of abstraction. This framework is designed to be the ultimate replacement for both the REST aspects of WCF and ASP.NET MVC Web Services. And since it was developed out of band with a dependency only on ASP.NET 4.0, it means that it can be used immediately in a variety of production scenarios.

    So at Tech Ed 2012 it was made abundantly clear in numerous sessions that Microsoft views Web API as the "Future of ASP.NET". In fact, one Microsoft PM even went as far as to say that if we look 3-4 years into the future, all ASP.NET web applications will be developed using the Web API approach. This is a fairly bold prediction and clearly telegraphs where Microsoft plans to allocate its resources going forward. Currently Web API is being delivered as part of the MVC4 package, but this is only temporary for the sake of convenience. It also sounds like there are still internal discussions going on in terms of how to brand the various aspects of ASP.NET going forward - perhaps the moniker of "ASP.NET Web Stack", coined a couple years ago by Scott Hanselman and utilized as part of the open source release of ASP.NET bits on CodePlex a few months back, will eventually stick. Web API is being positioned as the unification of ASP.NET - the glue that is able to pull this fragmented mess back together again. The "One ASP.NET" strategy will promote the use of all frameworks - WebForms, MVC, and Web API - even within the same web project. Basically the message is: utilize the appropriate aspects of each framework to solve your business problems. Instead of navigating developers to a fork in the road, the plan is to educate them that "hybrid" applications are a great strategy for delivering solutions to customers. In addition, the service-oriented approach coupled with the client-side development promoted by Web API can effectively be used in both WebForms and MVC applications. So this means it is also relevant to application platforms like DotNetNuke and SharePoint, which means that it starts to create a unified development strategy across all ASP.NET product lines once again.

    And so what about MVC? There have actually been rumors floated that MVC has reached a stage of maturity where, similar to WebForms, it will be treated more as a maintenance product line going forward (MVC4 may in fact be the last significant iteration of this framework). This may sound alarming to some folks who have recently adopted MVC, but it really shouldn't, as both WebForms and MVC will continue to play a vital role in delivering solutions to customers. They will just not be the primary area where Microsoft is spending the majority of its R&D resources. That distinction will obviously go to Web API. And when the question comes up of why not enhance MVC to make it work with Web API, you must take a step back and look at this from a higher level to see that it really makes no sense. MVC is a server-side page compositing framework, whereas Web API promotes client-side page compositing with a heavy focus on web services. Making MVC work well with Web API would require a complete rewrite of MVC, and at the end of the day there would be no upgrade path for existing MVC applications. So it really does not make much business sense.

    So what does this have to do with DotNetNuke? Well, around 8-12 months ago we recognized the software industry trends towards web services and client-side development. We decided to utilize a "hybrid" model which would provide compatibility for existing modules while at the same time providing a bridge for developers who wanted to utilize more modern web techniques. Customers who like the productivity and familiarity of WebForms can continue to build custom modules using the traditional approach. However, in DotNetNuke 6.2 we also introduced a new Services Framework which is actually built on top of MVC2 (we chose to leverage MVC because it had the most intuitive, light-weight REST implementation in the .NET stack). The Services Framework allowed us to build some rich interactive features in DotNetNuke 6.2, including the Messaging and Notification Center and Activity Feed. But based on where we know Microsoft is heading, it makes sense for the next major version of DotNetNuke (which is expected to be released in Q4 2012) to migrate from MVC2 to Web API. This will likely result in some breaking changes in the Services Framework, but we feel it is the best approach for ensuring the platform remains highly modern and relevant. The fact that our development strategy is perfectly aligned with the "One ASP.NET" strategy from Microsoft means that our customers and developer community can be confident in their current and future investments in the DotNetNuke platform.
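
    For readers who haven't seen it, here is roughly what the convention-based model described above looks like - a minimal sketch of a Web API controller (the names are illustrative, and it assumes the default "api/{controller}/{id}" route that ships with the framework):

        using System.Collections.Generic;
        using System.Web.Http;

        public class PartiesController : ApiController
        {
            // GET api/parties - the method name maps to the HTTP verb by convention
            public IEnumerable<string> Get()
            {
                return new[] { "Democrat", "Republican", "Independent" };
            }

            // GET api/parties/5 - content negotiation serializes the return value (e.g. to JSON)
            public string Get(int id)
            {
                return "Party " + id;
            }
        }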

  • allow client using webpage to run and use 1 server side executable

    - by richardboon
    In the simplest terms, here's what I must do: when a user connects to a webpage (port 80) via their browser, the web server will run a customized, proprietary third party Windows executable [located on the server], then display that program and allow the user full control of it (inside the browser). Note: I cannot rewrite/redistribute that 3rd party desktop GUI program.
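
    For what it's worth, only the "run it on the server" half of this can be sketched directly in ASP.NET, as below (the path is hypothetical). Note that this just starts the program under the web server's identity; it does not, by itself, give the browser any interactive control over the program's GUI - that part typically needs some remote-display/VNC-style gateway:

        using System.Diagnostics;
        using System.Web.UI;

        public partial class LaunchTool : Page
        {
            protected void Page_Load(object sender, System.EventArgs e)
            {
                var psi = new ProcessStartInfo(@"C:\vendor\tool.exe") // hypothetical path
                {
                    UseShellExecute = false // runs headless under the app pool account
                };
                Process.Start(psi); // the exe now runs server-side, invisible to the client
            }
        }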

  • Is it possible to include external DLLs in a Silverlight project that weren't built for Silverlight?

    - by Josh Smeaton
    We're trying to work out a deployment strategy for an internal tool that we're creating. It's a control that uses a set of 3rd party libraries to communicate with a specific type of server. So I have a class library built that uses these libraries. I was looking to deploy a Silverlight application that uses our library, which in turn uses the set of 3rd party libraries. When trying to reference the 3rd party libraries, I get the "library can not be referenced because it was not built as a silverlight library" error. The type of project I'm trying to build is a Silverlight application - not a DLL library or anything else. Is it possible to create a Silverlight application using libraries created outside of Silverlight?

  • BAD_UID error while exporting key in CryptoAPI

    - by mindthief
    Hi all, I am writing a test application for Microsoft CryptoAPI. I want to export the secret key of one party using the public key of the second party, and then import that secret key as the second party's secret key (this sets up a shared secret key for communication). Here is my code:

        if (!CryptExportKey(encryptT->hSymKey, decryptT->hPubKey, SIMPLEBLOB, 0, keyExBuf, &bufLen))
        {
            FormattedDebugPrint(NULL, GetLastError(), "could not export secret key", TRUE);
            return -1;
        }
        if (!CryptImportKey(decryptT->hCryptProv, keyExBuf, bufLen, decryptT->hPubKey, 0, &(decryptT->hSymKey)))
        {
            FormattedDebugPrint(NULL, GetLastError(), "could not import secret key", TRUE);
            return -1;
        }

    And this gives the error:

        80090001: Bad UID.

    The public keypair is being generated for both encryptT and decryptT (sender, receiver) by calling:

        CryptGenKey(encryptT->hCryptProv, CALG_RSA_KEYX, CRYPT_EXPORTABLE, &(encryptT->hPubKey))

    Any idea what could be causing the error? Thanks.

  • Using XML to generate SAP ABAP and/or SAPScript?

    - by Rob
    Has anyone got examples and/or experience of generating SAP ABAP or SAPScript form code from XML that came from an external application? This would help:

        - creation of SAP-based applications in a data-driven way, by automating the knowledge to do so from the XML exported by an external application
        - automated inputting of knowledge from an external application into SAP applications, rather than manually copying between systems
        - enabling 3rd-party external tools to be used to create data, perhaps in an easier-to-use way than could be done in SAP; also useful if there was already heavy investment in training with these third party tools rather than SAP, or if the employment market favoured staff with knowledge of these tools
        - enabling creation of data for multiple purposes and views: those in SAP and outside SAP
        - enabling inter-operability of SAP with 3rd-party external tools

    I'm looking for:

        - experiences as to the feasibility
        - tools, e.g. parsers, XSLT etc.
        - examples

  • Assembly reference in Silverlight class library and used only in xaml is not packaged in XAP

    - by Brandon Copeland
    I have a 3rd party library (A). That library is referenced in my Silverlight class library (B). That Silverlight class library is referenced in my Silverlight application (C). The 3rd party library is not explicitly referenced in the Silverlight application. It seems that "A" is added to my XAP if "A" is used in any class in "B", because of the chain of dependencies (C - B - A). This is the behavior I would expect and need. If "A" is never explicitly used in a C# class but only defined in XAML, the assembly is not packaged into the XAP. Maybe "A" includes a control that is only used declaratively and never referenced otherwise. Is this behavior by design? Am I missing a property somewhere that controls this? I would prefer not to explicitly reference the third party library in my Silverlight application. What's the best practice to ensure all necessary assemblies are packaged in the XAP?
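
    One workaround sometimes suggested for this situation (a sketch only; "ThirdParty.SomeControl" is an invented name standing in for any public type exported by assembly "A") is to put a hard code reference to a type from "A" somewhere inside "B", so the compiler records a real dependency and the XAP packaging picks the assembly up even though it is otherwise only used in XAML:

        // Somewhere in class library "B"
        internal static class AssemblyAnchor
        {
            // Forces a compile-time dependency on assembly "A"
            private static readonly System.Type ForceA = typeof(ThirdParty.SomeControl);
        }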

  • Handling credit cards and IOS

    - by Susan Jackie
    I am using an NSURLConnection asynchronous request to transmit credit card information to a secure third party server. I do the following:

    1. Get the credit card number, CVV, etc. from the UITextFields.
    2. Encode the credit card information into a JSON format.
    3. Set it as the HTTP body of the NSURLConnection request as follows:

        NSURL *url = [NSURL URLWithString:@"https://www.example.com"];
        NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:url];
        [request setHTTPMethod:@"POST"];
        [request setValue:@"application/json" forHTTPHeaderField:@"Accept"];
        [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
        [request setHTTPBody:[NSJSONSerialization dataWithJSONObject:params options:kNilOptions error:&parseError]];

    4. Send this information via an asynchronous request to the secure third party server:

        [NSURLConnection sendAsynchronousRequest:request queue:queue completionHandler:^(NSURLResponse *response, NSData *data, NSError *requestError) {

    What should I be considering when sending user credit card information to a third party server using an NSURLConnection asynchronous request?

  • Setting image DPI in relation to height/width C#

    - by Aaron
    Hi, I'm writing an application to send some images to a third party, and the images must be 200x200 DPI. The image is a Bitmap sized at 500 pixels wide and 250 pixels high. The first time I tested the images with the third party, my resolution was incorrect. I merely used image.SetResolution(200, 200) to set it to 200x200. This, however, only changed the resolution tag for the image and, according to my third party technical contact, did not properly adjust the image height and width. Is there a ratio that I can use so that for each X units I increment the resolution, I increment the corresponding height or width by Y units? I thought that I could just increment the resolution without having to increment the height or width. Thank you, Aaron.
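
    The underlying arithmetic: physical size in inches = pixels / DPI, so raising the DPI tag without touching the pixels shrinks the physical size (500 px at 200 DPI is 2.5 inches; to keep the size those 500 px had at, say, 96 DPI you need 500 * 200 / 96, about 1042 px). A minimal sketch of scaling pixels and resolution together with System.Drawing (the method name is illustrative):

        using System.Drawing;
        using System.Drawing.Drawing2D;

        static Bitmap RescaleToDpi(Bitmap source, float targetDpi)
        {
            // scale factor between the old and new resolution
            float sx = targetDpi / source.HorizontalResolution;
            float sy = targetDpi / source.VerticalResolution;
            var result = new Bitmap((int)(source.Width * sx), (int)(source.Height * sy));
            result.SetResolution(targetDpi, targetDpi); // tag and pixel count now agree
            using (var g = Graphics.FromImage(result))
            {
                g.InterpolationMode = InterpolationMode.HighQualityBicubic;
                g.DrawImage(source, 0, 0, result.Width, result.Height);
            }
            return result;
        }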

  • Single Sign on using Shibboleth

    - by user196614
    Hi, I have to implement Single Sign On in my .NET (3.5) project using Shibboleth. The detailed requirement goes this way:

    1. I have developed a web application using .NET (3.5) named "abc.com".
    2. There are some third party applications which will be launched from "abc.com".
    3. If I have logged in to "abc.com", and I then launch any of the supported third party applications, they should not ask for login information again.

    For the last few days I have been reading about Shibboleth at https://spaces.internet2.edu/display/SHIB2/Home and I have also installed the Identity Provider (IdP) and Service Provider (SP) from https://spaces.internet2.edu/display/SHIB2/Installation. Still, I am unable to make out how my "abc.com", the third party applications and Shibboleth all fit into one picture. Can anyone please guide me?

  • ActiveX communication

    - by Mario Marinato -br-
    I'm developing an ActiveX EXE that exposes a specific class to a third-party software package. This third-party software instantiates an object of this class and uses its methods. Strangely, the third-party software destroys its object of my exposed class as soon as it calls one specific method, but I have no idea why this happens. The only clue I have is that this method is the only one that returns a value. All the other ones are simple 'subs' that do not return any value, and when they are called nothing wrong happens. I'm using VB6. Do you guys have any idea why this is happening?

  • Expression.OrElse, dynamically creating a condition.

    - by Jim
    Hi, I am attempting to create a dynamic where clause using the standard expression API.

        var query = (from p in Parties
                     orderby p.PartyId
                     orderby p.FullName
                     select p).AsQueryable();

        Expression<Func<Party, bool>> @fn =
            (p) => SqlMethods.Like(p.FullName, "%smith%") || SqlMethods.Like(p.Person.FirstName, "%smith%");
        Expression<Func<Party, bool>> @sn =
            (p) => SqlMethods.Like(p.Person.FirstName, words[0]);

        ParameterExpression pe = Expression.Parameter(typeof(Party), "p");
        Expression orelse = Expression.OrElse(
            Expression.Lambda(@fn, pe),
            Expression.Lambda(@sn, pe)
        );

    The expressions above will ultimately be added to a where clause. I need to add a bunch of 'likes'. How do I do this? I get InvalidOperationException on the operator OrElse. I have also tried Expression.Or. Thanks, Regards, Craig.
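
    One way predicates like these are commonly composed (a sketch, continuing from the code above): Expression.OrElse wants two bool-typed expression bodies, not two lambdas, so invoke each existing lambda with a single shared parameter and wrap the combined body in a new lambda (LINQ to SQL can translate the resulting InvocationExpressions):

        var pe = Expression.Parameter(typeof(Party), "p");
        var body = Expression.OrElse(
            Expression.Invoke(@fn, pe),   // substitutes pe for @fn's own parameter
            Expression.Invoke(@sn, pe));
        var predicate = Expression.Lambda<Func<Party, bool>>(body, pe);
        var filtered = query.Where(predicate); // more 'likes' can be folded in the same way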

  • How do I display non-normalized data in a hierarchical structure?

    - by rofly
    My issue is that I want to display data in a hierarchical structure like so:

        Democrat
            County Clerk
                Candidate 1
                Candidate 2
            Magistrate
                Candidate 1
                Candidate 2
                Candidate 3

    But I'm retrieving the dataset like this:

        Party    | Office       | Candidate
        Democrat | County Clerk | Candidate 1
        Democrat | County Clerk | Candidate 2
        Democrat | Magistrate   | Candidate 1
        Democrat | Magistrate   | Candidate 2
        Democrat | Magistrate   | Candidate 3

    I planned on using nested repeaters, but to do it that way I need the distinct values of Party, and then the distinct values of Office within that Party. Are there any .NET functions to easily do what I'm attempting? Would there be a better way of displaying the information other than repeaters? Thanks in advance!
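
    A sketch of the distinct-values step with LINQ (Row is a hypothetical class matching the dataset columns): grouping the flat rows by Party, and then by Office within each group, produces exactly the two levels the nested repeaters need:

        using System.Collections.Generic;
        using System.Linq;

        class Row { public string Party; public string Office; public string Candidate; }

        // rows is the flat result set shown above
        static object BuildTree(IEnumerable<Row> rows)
        {
            return rows
                .GroupBy(r => r.Party)
                .Select(p => new
                {
                    Party = p.Key,                        // outer repeater data source
                    Offices = p.GroupBy(r => r.Office)
                               .Select(o => new
                               {
                                   Office = o.Key,        // middle repeater
                                   Candidates = o.Select(r => r.Candidate).ToList()
                               }).ToList()
                }).ToList();
        }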

  • Problem doing SVN Vendor Branch - merge

    - by Gyan
    Hi, I am trying to use the svn vendor branch approach to upgrade a third party library (we have modified its source code). I followed all the steps to create the vendor branch:

    1. created the vendor branch for the old version (3rd party library)
    2. created the vendor branch for the latest version (3rd party library)
    3. copied the latest version to the current folder (using the svn_load_dirs.pl script)

    The structure of the vendor repository in svn:

        URL/vendor/library/3.5.0
        URL/vendor/library/3.7.0
        URL/vendor/library/current

    I have library-3.5.0 used/modified at URL/trunk/library/customized-library. I have a problem when I try to merge the difference between URL/vendor/library/3.7.0 and URL/vendor/library/3.5.0 into URL/trunk/library/customized-library. I am in the folder where URL/trunk/library/customized-library is checked out, and I use the following command to do the merge:

        svn merge URL/vendor/library/3.5.0 URL/vendor/library/current . --accept PARAMETERS

    When I use theirs-conflict for the --accept parameter, it ignores all of my changes to the old version and copies the files from 3.7.0. When I use mine-conflict, it ignores the files in 3.7.0. When I use postpone, it throws a "tree conflict" error. Thanks, Gyan

  • Is a new thread in a Visual Studio test project aborted when the test ends?

    - by Michel
    Hi, I have to do some message exchange with a 3rd party (in a website). When the client posts a page, I start the message exchange. When that doesn't succeed for some reason, I report this to the client by rendering the page with a message. In the background, in a separate thread, I start a process to send abort messages to the 3rd party. I can't do this while the user is waiting for the page to come back, because it might take a few minutes. But in a test project, the test ends when the message to the 3rd party is sent, right after the new thread is started. And it seems that the new thread also ends when the test is done. Is that normal behaviour? I do start the thread in a new class with a reference to 2 objects from the class which tries to send the message in the first place - might that be a problem?
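
    A sketch of the usual diagnosis, assuming the test host tears its process down as soon as the test method returns: any thread still running at that point dies with the process, so in a test the options are to block until the worker finishes or to fake the 3rd party call. SendAbortMessages below is a hypothetical stand-in for the real worker:

        using System;
        using System.Threading;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class AbortMessageTests
        {
            [TestMethod]
            public void AbortMessagesAreSent()
            {
                var worker = new Thread(SendAbortMessages);
                worker.Start();
                // keep the test method alive until the background work completes
                bool finished = worker.Join(TimeSpan.FromSeconds(30));
                Assert.IsTrue(finished, "abort-message thread did not finish in time");
            }

            private static void SendAbortMessages() { /* stand-in for the real exchange */ }
        }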

  • How to get the original URL after HttpContext.RewritePath() has been called

    - by jessegavin
    I am working on a web app which makes use of a 3rd party HttpModule that performs URL rewriting. I want to know if there's any way to determine the original URL later on, in the Application_BeginRequest event. For example...

        Original url: http://domain.com/products/cool-hat.aspx
        Re-written url (from the 3rd party HttpModule): http://domain.com/products.aspx?productId=123

    In the past I have written HttpModules that store the original URL in HttpContext.Items, but this is a 3rd party app and I have no way of doing that. Any ideas would be appreciated.
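
    One thing worth checking (a sketch; the behaviour should be verified against the specific module, since module event ordering matters): Request.RawUrl keeps the URL the client actually sent even after RewritePath has been called, while Request.Url reflects the rewritten path:

        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_BeginRequest(object sender, EventArgs e)
            {
                // RawUrl keeps what the browser sent, unaffected by RewritePath
                string originalUrl = Request.RawUrl;           // e.g. /products/cool-hat.aspx
                // the path after the 3rd party module's rewrite
                string rewritten = Request.Url.PathAndQuery;   // e.g. /products.aspx?productId=123
            }
        }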

  • Permissions error for a signed Java applet when including external JAR files

    - by sri
    I have a signed Java applet, and it works fine. But now I have to integrate some 3rd party JAR files with it. When I test it from Eclipse, the whole thing works correctly. But when I run it as an applet, it gives me a FilePermission error. I thought this was because those 3rd party JAR files don't have a java.policy.applet within them, but manually adding the policy file doesn't get rid of the error. What am I missing? Thanks!

    All the 3rd party JAR files sit on the server filesystem like so: A.jar, B.jar, C.jar. And I include them in the applet tag like so:

        <applet archive="my.jar,A.jar,B.jar,C.jar">
        </applet>

    Also, in the META-INF/MANIFEST.MF file in my.jar, I include those JAR files like so:

        Class-Path: A.jar,B.jar,C.jar

  • Implement VOIP on iPhone and iPad

    - by user339994
    Hi, is anybody aware of how to implement a VOIP feature on iPhone and iPad? The things for which I need clarity are:

    1. By using which third party library/protocol can I implement this feature? Or are there any built-in classes available in Objective-C which I can make use of?
    2. Is there any App Store-accepted iPhone application which uses a VOIP implementation? If so, where can I get implementation details for it?
    3. This is a generic question: can we use any third party's library in our iPhone application, or do we need to get any special permission from Apple regarding the usage of the third party libraries used?

    Please let me know if you need more details regarding this. Thanks, Santhana

  • Maven overlay with war not generated by maven

    - by dnc253
    I'm pretty new to Maven, so I apologize if this is a newbie question. We are trying to integrate a 3rd-party app into ours. This 3rd-party functionality is delivered to us as a war file. As part of this integration, I want to add some extra jars and a properties file. Googling around, I discovered overlays. However, in all the examples I've seen, it looks like the wars being overlaid were themselves generated by Maven, so Maven can figure out dependencies, conflicts, etc. between the two. I'm just wondering if there's a way for me to overlay this 3rd-party war with the extra stuff I want. If so, what would that look like in the pom.xml?

  • Hibernate mapping to manage bidirectional relationship using intermediate table

    - by shikarishambu
    I have an object hierarchy as follows:

        Party, inherited by Organization and Person
        Organization, inherited by Customer and Vendor
        Person, inherited by Contact

    In the database I have the following tables: Party, Customer, Vendor, Contact. All of them have a corresponding row in the Party table. A Contact belongs to either a Vendor or a Customer. I have a field on the Contact table for org_party_id. However, since the organization can be either a Customer or a Vendor, I need to be able to look at different tables. Is there a way to map this in Hibernate? Or a better way to manage it in the DB/Hibernate?

  • Hashing and salting values

    - by Avanst
    I am developing a small web app that internally authenticates users. Once the user is authenticated, my web app passes some information, such as a userID and the person's name, to a third party web application. The third party developer is suggesting that we hash and salt the values. Forgive my ignorance, but what exactly does that mean? I am writing the app in Java. So what I am planning on doing is hashing the userID, the person's name, and some Math.random() value as the salt with Apache Commons Digest Utils SHA512, and passing that hashed string along with the userID and person's name. Is that the standard practice? I should be passing the third party the salt as well, correct?
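
    For what "hash and salt" means mechanically (the asker's stack is Java/Commons DigestUtils, but the idea is the same anywhere; sketched here in C# with illustrative names): generate a random salt from a cryptographic RNG rather than Math.random(), mix it into the input before hashing, and send the salt alongside the hash so the other side can recompute and verify:

        using System;
        using System.Linq;
        using System.Security.Cryptography;
        using System.Text;

        static (string hashHex, string saltHex) HashWithSalt(string userId, string name)
        {
            var salt = new byte[16];
            using (var rng = RandomNumberGenerator.Create())
                rng.GetBytes(salt);                        // CSPRNG, unlike Math.random()

            byte[] input = Encoding.UTF8.GetBytes(userId + "|" + name);
            byte[] hash;
            using (var sha = SHA512.Create())
                hash = sha.ComputeHash(input.Concat(salt).ToArray());

            // both values travel to the third party so it can verify the hash
            return (BitConverter.ToString(hash).Replace("-", ""),
                    BitConverter.ToString(salt).Replace("-", ""));
        }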
