Search Results

Search found 19630 results on 786 pages for 'enterprise applications'.

Page 243/786 | < Previous Page | 239 240 241 242 243 244 245 246 247 248 249 250  | Next Page >

  • Redirect application's graphical output in Windows Vista/7 (with DWM)

    - by puri
    I want to create a desktop manager that takes information about all running applications, including their states and screenshots, and displays and manipulates them in my own virtual space (probably in 3D). It can be considered another layer of abstraction on top of Windows itself. Because many native Windows Vista/7 features such as Flip 3D and Live Thumbnails can show each window's activity in real time (e.g. a video keeps playing in its taskbar thumbnail), I assume DWM either allows an application to redirect its output somewhere else, or allows certain processes to collect other applications' graphical output (maybe only that of child processes). Has Microsoft released a set of public APIs for this? If not, is it technically possible? And is it easier if I limit my scope to just .NET applications with WPF?
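
    One public piece of DWM that comes close to this is the desktop thumbnail API in dwmapi.h, which lets one top-level window host a live, continuously updated rendering of another window (the same machinery the taskbar previews build on). Below is a minimal C sketch under that assumption; ShowLiveThumbnail is just an illustrative wrapper name, and it only covers registering a thumbnail, not screenshot capture or 3D composition.

        #include <windows.h>
        #include <dwmapi.h>   /* link against dwmapi.lib */

        /* Host a live thumbnail of hwndSource inside hwndDest, drawn at destRect.
           Returns the thumbnail handle, or NULL on failure. */
        HTHUMBNAIL ShowLiveThumbnail(HWND hwndDest, HWND hwndSource, RECT destRect)
        {
            HTHUMBNAIL thumb = NULL;
            if (FAILED(DwmRegisterThumbnail(hwndDest, hwndSource, &thumb)))
                return NULL;

            DWM_THUMBNAIL_PROPERTIES props;
            ZeroMemory(&props, sizeof(props));
            props.dwFlags       = DWM_TNP_RECTDESTINATION | DWM_TNP_VISIBLE;
            props.rcDestination = destRect;   /* where to draw inside hwndDest */
            props.fVisible      = TRUE;

            if (FAILED(DwmUpdateThumbnailProperties(thumb, &props)))
            {
                DwmUnregisterThumbnail(thumb);
                return NULL;
            }
            return thumb;   /* call DwmUnregisterThumbnail(thumb) when done */
        }

    Note that this gives you live, composited output of other windows inside your own, but it is display-only; for actual pixel access you would still need something like PrintWindow or GDI capture.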

    Read the article

  • Use Tomcat with Java SecurityManager?

    - by pauline
    I'm writing a web application that is supposed to run on Tomcat on Ubuntu. On Ubuntu, Tomcat is configured by default to run with the Java SecurityManager. Besides my own web application, there will only be some well-known third-party web applications related to my own, like the BIRT report engine. If one of the web applications fails or gets compromised, it may take down all the others; that is acceptable, because they all belong together. What I don't want to happen is for a compromised web app to compromise the system itself, for example by calling rm -r /. Do I need to use the Java SecurityManager to achieve this, or is it only necessary for protecting one web app from another? I'd really like to avoid the effort of creating .policy files for all the third-party web applications I intend to use.

    Read the article

  • Private Cloud: Putting some method behind the madness

    - by Sudip Datta
    Finally, I decided to join the blogging community. And what could be a better time to start than the week after OpenWorld 2012? 50K+ attendees, demonstrations, speaker sessions and a whole lot of buzz about Oracle Cloud... it was raining clouds at this year's OpenWorld. I am not here to write about Oracle's cloud strategy in general, but about Enterprise Manager's cloud management capabilities. This year's OpenWorld was the first after we announced 12c Cloud Control, and we were happy to share the stage with quite a few early adopters. Stay tuned for videos from our customers and partners; I will post them as they get published.

    I met a number of platform administrators: Oracle DBAs, middleware admins, SOA admins... The cloud has affected them all, at least to the point where it prompted more than just curiosity. Most IT infrastructure is already heavily virtualized (on VMware and on others, including Oracle VM), and some would claim they are already on the "cloud" (at least their sysadmins told them so). But none of them were confident of the benefits, because their pain points continued to grow. Isn't cloud supposed to ease those? Instead, they were chasing hundreds of databases running on hundreds of VMs, often with as much certainty as Heisenberg propounded. What happened to the age-old IT discipline around administration, compliance and configuration management?

    VMs are great for what they are. I personally think they have opened the doors to new approaches in which an application stack gets provisioned and updated. In fact, Enterprise Manager 12c is possibly the only tool out there that can provision a full-fledged application as VM Assemblies. At this year's OpenWorld, customers talked about how they provisioned RAC and Siebel assemblies, which, as the techies out there know, are not trivial (hearing that provisioning time for Siebel came down from weeks to hours was gratifying indeed). However, I do have an issue with a "one size fits all" approach to cloud. In a week's span, I met several personas:

    - Project owners requiring an EC2-like VM instance for their projects
    - Admins needing the same for SPARC/Solaris
    - DBAs requiring dedicated databases for new projects
    - APEX developers needing just a ready-to-consume schema as a service
    - Java developers looking for a runtime platform
    - QA engineers needing a fast clone of their production environment

    If you drill down further, you will end up peeling back more layers of detail. For example, the requirements for load testing and functional testing are very different. For load testing, the test environment should ideally be the same as production. You shouldn't run production on Exadata and load test on a VM; they will just not be good representations of one another. For functional testing it possibly does not matter. DBAs seem to be the worst affected of the lot. It seems they have been asked to choose between agile provisioning and faster runtime performance. And in some cases it is really a Hobson's choice, because their infrastructure provider made no distinction between the OLTP application and the virtual desktop! Sad indeed. When one looks at the portfolio of services that we already offer (vanilla IaaS, VM Assembly based PaaS, DBaaS) or have announced (Java PaaS, Instant Cloning, Schema-aaS), one could possibly think that we are trying to be the "renaissance man"! Well, I would have possibly digested that had it not been for the various personas that I described above.
    Getting the use cases right is very important for an application such as cloud management. We iterate over these again and again and re-validate them in CABs (Customer Advisory Boards). We consider all the major aspects of tenancy: service placement, resource isolation (can a tenant execute an expensive SQL statement and run away with all the resources?), quotas and security. We in Engineering keep reminding ourselves that we are dealing with enterprise clouds. We owe it to our customer base! In the coming posts, I will drill down more into each of the services. In the meantime, here are some collateral and demos to get started with EM 12c: http://www.oracle.com/technetwork/oem/cloud-mgmt/index.html

    Sudip Datta

    The views expressed here are my own and do not necessarily reflect the views of Oracle.

    Read the article

  • Partial Git deployment strategy?

    - by MatW
    I need to set up a Kohana dev environment that allows me to make full use of shared module/system classes across separate applications, with each application typically belonging to a different client. I use Git for source control, but am struggling to come up with a clean deployment method that will allow me to pull only those parts of the dev environment specific to a client/app down into that client's production environment (assuming that the client's production environment will have Git installed).

    Dev environment:

    - kohana
      - applications
        - clientapp1
        - clientapp2
      - modules
      - public_html
        - clientapp1
        - clientapp2
      - system
        - 3.0.1
        - 3.0.5

    Client 1's production environment:

    - /
      - applications
        - clientapp1
      - modules
      - public_html
        - client_app1
      - system
        - 3.0.5

    Naturally, I want to have total control over each client "sub-repo" as if it were an independent repo (in terms of gitignore, etc.). I have seen topics that cover Git's sparse checkout feature, but it seems like it may cause a few problems down the line from a maintenance point of view, and I don't like the idea of the entire repo's metadata existing in the client's production environment repo. As you can probably tell, I'm not exactly a Git power user, so any suggestions / wisdom are very welcome!

    Read the article

  • Right-Time Retail Part 2

    - by David Dorf
    This is part two of the three-part series.

    Right-Time Integration

    Of course these real-time enabling technologies are only as good as the systems that utilize them, and it only takes one bottleneck to slow everyone else down. What good is an immediate stock-out notification if the supply chain can't react until tomorrow? Since being formed in 2006, Oracle Retail has been not only adding more integrations between systems, but also modernizing integrations for appropriate speed. Notice I tossed in the word "appropriate." Not everything needs to be real-time; again, we're talking about Right-Time Retail. The speed of data capture, analysis, and execution must be synchronized or you're wasting effort. Unfortunately, there isn't an enterprise-wide dial that you can crank up for your estate. You'll need to improve things piecemeal, with people and processes as limiting factors, while choosing the appropriate types of integrations.

    There are three integration styles we see in the retail industry. First is batch. I know, the word "batch" just sounds slow, but this pattern is less about velocity and more about volume. When there are large amounts of data to be moved, you'll want to use batch processes. Our technology of choice here is Oracle Data Integrator (ODI), which provides a fast version of Extract-Transform-Load (ETL). Instead of the three-step process, the load and transform steps are combined to save time. ODI is a key technology for moving data into Retail Analytics, where we can apply science. Performing analytics on each sale as it occurs doesn't make any sense, so we batch up a statistically significant amount and submit it all at once.

    The second style is fire-and-forget. For some types of data, we want the data to arrive as soon as possible, but immediacy is not strictly necessary; speed is less important than guaranteed delivery, so we use the message-oriented middleware available in both WebLogic and the Oracle database. For example, Point-of-Service transactions are queued for delivery to Central Office at corporate. If the network is offline, those transactions remain in the queue and will be delivered when the network returns. Transactions cannot be lost and they must be delivered in order. (Ever tried processing a return before the sale?) To enhance the standard queues, we offer the Retail Integration Bus (RIB) to help with the management and monitoring of fire-and-forget messaging in the enterprise.

    The third style is request-response and is most commonly implemented as Web services. This is a synchronous message where the sender waits for a response. In this situation, the volume of data is small, guaranteed delivery is not necessary, but speed is very important.
    Examples include the website checking inventory, a price lookup, or processing a credit card authorization. The Oracle Service Bus (OSB) typically handles the routing of such messages, and we've enhanced its abilities with the Retail Service Backbone (RSB). To better understand these integration patterns and where they apply within the retail enterprise, we're providing the Retail Reference Library (RRL) at no charge to Oracle Retail customers. The library is composed of a large number of industry business processes, including those necessary to support Commerce Anywhere, as well as detailed architectural diagrams. These diagrams allow implementers to understand the systems involved in integrations and the specific data payloads. Furthermore, with our upcoming release we'll be providing a new tool called the Retail Integration Console (RIC) that allows IT to monitor and manage integrations from a single point. Using RIC, retailers can quickly see where integration activity is occurring, along with volume statistics, average response times, and errors. The dashboards provide the ability to drill down into the architecture documentation to gather information all the way down to the specific payload. Retailers that want real-time integrations will also need real-time monitoring of those integrations to ensure service-level agreements are maintained. Part 3 looks at marketing.
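
    To make the request-response style described above concrete, here is a minimal sketch of such a call in plain C with libcurl: a small payload, a synchronous request, and a caller that blocks until the answer comes back (or a deadline expires). It uses a made-up endpoint URL rather than any Oracle Retail or OSB/RSB interface, purely to illustrate the pattern.

        #include <stdio.h>
        #include <curl/curl.h>

        /* Synchronous price lookup: send the request and wait for the response.
           Call curl_global_init(CURL_GLOBAL_DEFAULT) once at program start. */
        int lookup_price(const char *sku)
        {
            CURL *curl = curl_easy_init();
            if (curl == NULL)
                return -1;

            char url[256];
            snprintf(url, sizeof(url), "https://retail.example.com/price?sku=%s", sku);

            curl_easy_setopt(curl, CURLOPT_URL, url);
            curl_easy_setopt(curl, CURLOPT_TIMEOUT, 5L);  /* request-response needs a deadline */

            CURLcode rc = curl_easy_perform(curl);        /* blocks: send, wait, receive */
            curl_easy_cleanup(curl);

            return (rc == CURLE_OK) ? 0 : -1;
        }

    Contrast this with the fire-and-forget style, where the caller hands the message to a queue and returns immediately, trusting the middleware to deliver it later and in order.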

    Read the article

  • Create your own "OpenID-like system" provider

    - by user502052
    I know that Facebook uses its own OpenID-like system called "Facebook Connect", which you can use to authenticate users on your site, among other features. In my case I have multiple Ruby on Rails applications: users.example.com, profiles.example.com, photos.example.com, ... I would like to use 'users.example.com' as a web service that allows users to authenticate to all my other applications, the same way Facebook Connect or OpenID works. In a few words, 'users.example.com' must work as an "OpenID-like system" for my applications under 'example.com'. Can anyone give me tips and links to some useful resources? P.S.: since I am a newbie in this matter, I do not know if I'm saying things that make sense, so someone could help me understand if I am wrong...

    Read the article

  • No Carbon Human Interface Toolbox in OS X 64-bit binaries?

    - by yairchu
    I get the impression that the Carbon Human Interface Toolbox does not work in 64-bit binaries. Apple's documentation says: The Carbon Help Manager is not available to 64-bit applications. ... The Control Manager is not available to 64-bit applications. ... The Data Browser is not available to 64-bit applications. ... I just want to verify that there is no workaround for this and that this is simply the case. If so, why doesn't Apple's documentation simply state it as such?

    Read the article

  • Which jsPerf test should I consider the standard for checking the performance of JavaScript template engines?

    - by bhargav
    I am searching for a JavaScript template engine that performs well when used in large JS applications and is also well suited to mobile applications, so I have gone through the various jsPerf tests for these. There seem to be a lot of them showing different results, and it is confusing to work out which is the standard test. Can someone point me to a standard jsPerf test that I can refer to, one that also includes the following templates: dust, underscore, hogan, mustache, handlebars? From what I have observed, doT.js is a consistent performer with good rendering speed, but is it mature enough for larger applications? What are the "with" and "no with" variants shown in the jsPerf tests? Can someone explain? In all the tests I have seen, popular templates like mustache, handlebars, dust, hogan, etc. seem to perform worse than other templates, so why do people use them and leave out the top performers? Is it because of the maturity of these template engines? Thanks in advance

    Read the article

  • iPhone Full and Lite version without StoreKit

    - by beryllium
    Hi there! I have Full and Lite applications that were built from the same code. The Lite version has an Upgrade button. What code should I place in the button's handler to check the user's payment and upgrade the application to the Full version? I know the StoreKit framework allows unlocking some features, but I need two separate applications. Maybe there is a tutorial on this topic, but I found nothing. If anyone has a reference link, please provide it. Neither of these applications has been uploaded to the App Store yet. Thanks.

    Read the article

  • Deploying .NET components (upgrade the component for only a single client application)

    - by Vytas999
    I use Visual Studio .NET to create a component that will be shared by two client applications. Eventually, I plan to deploy new versions of this component. However, not all of the new versions will be compatible with both client applications. When I deploy the component and the client applications, I must ensure that I can upgrade the component for a single client application. I must also minimize the need for configuration changes when I deploy a new version of the component. What are possible ways to achieve this goal?

    Read the article

  • PHP Connect to 4D Database

    - by Matt Reid
    Trying to connect to a 4D database. phpinfo() says PDO is installed, etc. Testing on a localhost MAMP system. However, when I run my code I get:

        Fatal error: Uncaught exception 'PDOException' with message 'could not find driver' in /Applications/MAMP/htdocs/4d/index.php:12
        Stack trace:
        #0 /Applications/MAMP/htdocs/4d/index.php(12): PDO->__construct('4D:host=127.0.0...', 'test', 'test')
        #1 {main}
          thrown in /Applications/MAMP/htdocs/4d/index.php on line 12

    My code is:

        $dsn = '4D:host=127.0.0.1;charset=UTF-8';
        $user = 'test';
        $pass = 'test';

        // Connection to the 4D SQL server
        $db = new PDO($dsn, $user, $pass);
        try {
            echo "OK";
        } catch (PDOException $e) {
            die("Error 4D : " . $e->getMessage());
        }

    Can't put my finger on the error; I'm using the settings under the PHP tab... Thank you.

    Read the article

  • Fastest way to deploy rails apps with Passenger

    - by yuval
    I am working on a DreamHost server with Rails 2.3.5. Every time I make changes to a site, I have to ssh into the server, remove all the files, upload a zip file containing all the new files for the site, unzip that file, migrate the database, and go. Something tells me there's a faster way to deploy Rails apps. I am using Mac Time Machine to keep track of different versions of my applications. I know Git tracks files, but I don't really know how to work with it to deploy my applications, since Passenger takes care of all the magic for me. What would be a faster way to deploy my applications (and avoid the downtime associated with my method when I delete all the files on the server)? Thanks!

    Read the article

  • Should I have a separate copy of all CakePHP files for every new application?

    - by BicMan
    I'm extremely new to CakePHP. From what I've gathered, it seems like I can have multiple applications that all share the same app and cake directories. So, let's say I have two applications. CakeFacebookApp and GenericCakeBlog. These applications are completely separate from each other and will have completely separate URLs, but they will reside on the same webhost. Should they both be within the same cake structure, or should they each have a full cake install in separate directories? Technically, I'm sure it will work either way, but I guess I'm looking for a best practice approach. Thanks.

    Read the article

  • iPhone C++ development / compiler on a non-Mac PC? (Windows? Linux?)

    - by Ehrann Mehdan
    According to the (in)famous iPhone Developer Program License Agreement change: 3.3.1 — Applications may only use Documented APIs in the manner prescribed by Apple and must not use or call any private APIs. Applications must be originally written in Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs (e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited). So it is allowed to develop iPhone apps using C++. My questions: Is there a compiler/IDE for developing iPhone apps using C++? Is that compiler/IDE available in non-Mac environments (Windows? Linux?)? If not, why not? I mean, an Eclipse C++ plugin for iPhone development would be quite popular. Or is there already any serious attempt to do that?

    Read the article

  • How do I design a web service (Microsoft) that can be consumed by multiple endpoints?

    - by Ben McCormack
    My company is planning to implement a solution in multiple applications that will help to validate mailing addresses at the point of data entry. We're using UPS's Extended Address Validation (XAV) web service API to validate the addresses. Our current plan is to build a .NET web service that can be used to communicate between our applications and the UPS API. We have applications in VB6, classic ASP, and .NET 2.0, so we'd like to implement a solution that can be easily consumed by each of these programming environments. What are our (Microsoft) options for designing a web service that can be consumed by multiple clients? In particular, is there a way to design a single web service that can respond with JSON (in case we want to validate our web page using JavaScript) in addition to XML? I'm new to designing web services and want to make sure we consider all of our options. I've heard terms like ASMX, WCF, OData, etc., but I don't know which frameworks will support what we're trying to do or where to start.

    Read the article

  • Database Partitioning and Multiple Data Source Considerations

    - by Jeffrey McDaniel
    With the release of P6 Reporting Database 3.0, partitioning was added as a feature to help with performance and data management. Careful investigation of requirements should be conducted prior to installation to help improve overall performance throughout the lifecycle of the data warehouse and to prevent future maintenance that would result in data loss. Before installation, try to determine how many data sources and partitions will be required, along with their ranges. In P6 Reporting Database 3.0, any adjustments outside of the defaults must be made in the scripts, and changes will require new ETL runs for each data source.

    Considerations:

    1. Standard Edition or Enterprise Edition of Oracle Database. If you aren't using Oracle Enterprise Edition Database, the partitioning feature is not available. Multiple data sources are only supported on Enterprise Edition of Oracle Database.

    2. Number of data source IDs for partitioning during configuration. This setting specifies how many partitions will be allocated for tables containing data source information. It requires some evaluation prior to installation, as there are repercussions if you don't estimate correctly. For example, suppose you configured the software for only 2 data sources and the partition setting was set to 2, but then a 3rd data source came along. The necessary steps to accommodate this change are as follows:

       a) By default, 3 partitions are configured in the Reporting Database scripts. Edit the create_star_tables_part.sql script located in <installation directory>\star\scripts and search for "partition". You'll see P1, P2, P3. Add additional partitions and sub-partitions for P4 and so on; these appear in several areas. (See the P6 Reporting Database 3.0 Installation and Configuration guide for more information on this and on how to adjust partition ranges.)

       b) Run starETL -r. This recreates each table with the new partition key. The effect of this step is that all table data will be lost except for history-related tables.

       c) Run starETL for each of the 3 data sources (with the data source number, e.g. starETL.bat "-s2", as defined in the P6 Reporting Database 3.0 Installation and Configuration guide).

       The best strategy for this setting is to overestimate based on possible growth. If during implementation it is deemed that there are at least 2 data sources with a possibility for growth, it is a better idea to set this setting to 4 or 5, allowing room for the future and preventing a 'start over' scenario.

    3. The Number of Partitions and the Number of Months per Partition settings are not specific to multi-data source. These settings work in accordance with a sub-partition of larger tables with regard to time-related data, and they are dataset specific for optimization. The number of months per partition is self-explanatory: optimally, the smaller the partition, the better the query performance, so if the dataset has an extremely large number of spread/history records, a lower number of months is optimal. Working in accordance with this setting is the number of partitions, which determines how many "buckets" will be created per the number-of-months setting.

    For example, if you kept the default of 3 for the number of partitions and selected 2 months for each partition, you would end up with:

    - 1st partition: 2 months
    - 2nd partition: 2 months
    - 3rd partition: all the remaining records

    Therefore, with regard to this setting, it is important to analyze your source database's spread ranges and history settings when determining the proper number of months per partition and the number of partitions to optimize performance. Also be aware that the DBA will need to monitor when these partition ranges fill up and when additional partitions need to be added. If you get to the final range partition and there are no additional range partitions, all data will be included in the last partition.

    Read the article

  • Eager loading in EF1.0

    - by Dave Swersky
    I have a many-to-many relationship: Application - Applications_Servers - Server This is set up in my Entity Data Model and all is well. My problem is that I'd like to eager-load the whole graph of Applications so that I have an IEnumerable<Applications>, each Application member populated with the Servers collection associated by the many-to-many relationship. Normally this wouldn't be a problem, but according to my research there must be a navigation property between Application and Server. This is not the case for me because my Applications_Servers join table has more in it than just the two keys. Therefore, there is no navigation property directly between Application and Server, and this doesn't work: var apps = (from a in context.Application.Include("Server") select a).ToList(); I get an error saying there is no navigation property on Application called "Server", and that's correct, there is none. How do I write the query to eager-load my Applications with their Servers in this case?

    Read the article

  • how to get game sounds or how to create them for iPhone

    - by iPhone Fun
    Hi all, I am having a problem getting sound, i.e. how do I get good sounds for our iPhone games, or how do I make good-quality sounds for iPhone game applications so that the applications become more attractive? There are many sites like www.freesound.org, but I am not able to find good sounds that suit my needs. If anyone knows where we can get good sounds or how to make them, please help me, as this will be helpful to anyone using sounds in iPhone or other applications. Thanks in advance

    Read the article

  • Input string was not in correct format.

    - by jawahar-sync
    Hi Experts, I have a Ribbon-like custom control. I created applications using this control. These applications work fine until I change the system number format. They crash if I change the system number format as below: Current format = English (United States), Decimal symbol = ',', Digit grouping symbol = '.'. When I run the applications, they throw an exception: "Input string was not in a correct format". Some other applications show the exception message "Input string '0,2,0,2' was not in a correct format", so I think that in WPF ES's XAML files we may declare some properties, e.g. Padding or Margin, as "0,2,0,2", and that this causes errors with the system number format above. I have noticed that this error only occurs on Windows Vista; it does not occur on Windows XP. I do not know why. I have also looked at this link, but it does not help for Vista: http://support.microsoft.com/kb/968227/en-us

    Read the article

  • Can I attach a .NET TraceListener to an externally running process?

    - by BBlake
    I am developing an application scheduling program that will run other applications using System.Diagnostics.Process. The external applications are of various types (some .NET and some not). For those external apps that have trace logging enabled, is there a way that I can attach the tracelistener of the parent/calling application to listen to and record all the trace output from the child/called application to the parent application's trace output? This is not primarily for debugging purposes. This is more to track trace output from all the various scheduled applications by collecting it into one place as much as possible. The scheduler app is still in the early design stages, but will be .NET, and I'm trying to clear up potential design issues before I get into it too far.

    Read the article

  • Database permissions and ORMs

    - by Jonn
    I've been using .NET's Entity Framework a lot lately and have absolutely no wish to go back to using stored procedures. I was shocked, though, that the company I'm building this project for has a policy where applications are only given accounts that have permission to access stored procedures! Apparently, they believe that there's a security risk involved in allowing applications to access the tables/views directly. I don't get this. My first question is: can someone enlighten me as to what kind of security risk applications having direct access to the database may pose? And if that's the case, are there any other ORM solutions that provide a workaround (I can't think of any logical possibility at the moment) that would allow me to circumvent the restrictions on the user account to be assigned to me? Or is my understanding wrong that I'd need direct permissions on the tables and views?

    Read the article

  • Get list of running processes, get active process (and its application) in Flex/AIR

    - by Adam Kiss
    Hello, just a quick one (or maybe not :]): is it somehow possible to get a list of running applications/processes and, while running in the background, check which process is active? Additionally, if the answer is yes, is it possible to react to a change of the active window/application just as if it were an Event, or to bind a custom event to it (e.g. Event.SystemActiveAppChange)? Thank you for answers as well as pointers. EDIT: to clear up a probable misunderstanding, I mean local applications, on your Win/Mac/Linux machine. I would like to (in the process of learning the language) track which apps I use the most, and maybe make a little graph. So, the point is: in an AIR app developed in Flex, I would like to get/list all running applications/processes, as well as which one is active (on the user's PC/Mac/Linux).

    Read the article

  • opening iWork documents in an iPad UIWebView

    - by user369156
    Hello, I'm writing an iPad application that has a UIWebView in which I open Word and Excel documents, but I want the user to be able to import those documents into the iWork applications, Pages and Numbers, just like you can in Safari when you open a document. If you open a document in Safari on the iPad, there is a button on the top bar that says "Open in..." and you can choose an application to open it in. You get the top bar to appear by tapping the middle of the page. So is there an option you can set to make UIWebView show that bar, automatically detect the content type, and populate the list with applications you can import into? Or do I have to build this myself? And if I have to build my own, how do I open URLs to import documents into Pages, Numbers, etc.? Thanks, -David

    Read the article

  • Are C/C++/ObjC/JS Apple's only allowed languages for iPhone development?

    - by fbrereto
    According to this post on Daring Fireball, a new iPhone SDK Agreement released in conjunction with today's iPhone OS 4.0 announcement specifically bans any iPhone application not implemented in C, C++, Objective-C, or JavaScript. The clear impact here is on the wide array of programs written in languages other than those. Is that your reading of the clause in the new agreement as well? Update: Here is the clause as printed on Daring Fireball: 3.3.1 — Applications may only use Documented APIs in the manner prescribed by Apple and must not use or call any private APIs. Applications must be originally written in Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs (e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited).

    Read the article

  • SSL_accept hangs... sometimes (C, Linux, OpenSSL)

    - by zbigh
    I'm currently working on an embedded Linux system. There are two crucial client applications on the system that connect to an external server (on another embedded system; everything is written in C). The two apps use different certificates. The SSL connection works... at least usually, but from time to time an error occurs: the server hangs in SSL_accept() when accepting a connection from one of the applications, the one using the older certificates. Restarting the server application does not help, nor does restarting the client; the only way out is to reboot the server system, unless I create a symbolic link to the new certificates used by the other app, in which case restarting the server app works. The error never occurs when both applications use the same, new certificate. Could this happen due to some strange OpenSSL cache or something like that?
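
    One generic way to make this kind of hang at least diagnosable, rather than a confirmed fix for the setup described here, is to stop SSL_accept() from blocking forever: put receive/send timeouts on the accepted socket before handing it to OpenSSL, so a stalled handshake fails with an error that can be logged. A minimal sketch, assuming an already-accepted TCP socket fd and an initialized SSL_CTX (accept_tls_with_timeout is just an illustrative name):

        #include <sys/socket.h>
        #include <sys/time.h>
        #include <openssl/ssl.h>
        #include <openssl/err.h>

        /* Accept a TLS connection on an already-accepted TCP socket, but give up
           after 'seconds' instead of blocking forever inside SSL_accept(). */
        SSL *accept_tls_with_timeout(SSL_CTX *ctx, int fd, int seconds)
        {
            struct timeval tv = { .tv_sec = seconds, .tv_usec = 0 };

            /* Blocking reads/writes on this socket now time out. */
            setsockopt(fd, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof(tv));
            setsockopt(fd, SOL_SOCKET, SO_SNDTIMEO, &tv, sizeof(tv));

            SSL *ssl = SSL_new(ctx);
            if (ssl == NULL)
                return NULL;
            SSL_set_fd(ssl, fd);

            if (SSL_accept(ssl) != 1) {
                /* Handshake failed or timed out: dump the OpenSSL error queue. */
                ERR_print_errors_fp(stderr);
                SSL_free(ssl);
                return NULL;
            }
            return ssl;
        }

    With a timeout in place, the failing handshake leaves an error trail instead of a wedged server, which usually narrows the problem down (for example to the certificate set or session handling for the older client) far faster than rebooting.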

    Read the article
