Search Results

Search found 25204 results on 1009 pages for 'event stream processing'.


  • jQuery: How to fire event when all asynchronous calls return?

    - by Jeremy
    I have a jQuery application that loads data from five asynchronous server calls. I do not want to display any data until all five calls return. (I plan on displaying a Loading message until that happens.) How can I detect when all five calls have returned? I considered having each callback method increment a variable (using jQuery's data() method, perhaps) and then waiting for the value to become 5. (I am not sure yet how I would listen for that event.) I do not think this is a very good solution, however. What would happen if two calls return at the same time? Is there a better way to do this?
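
    One common approach, assuming jQuery 1.5 or later (where $.ajax and its shorthands return a Deferred), is to let $.when() fire a single callback once every call has resolved; and since JavaScript callbacks run one at a time on a single thread, the "two calls return at the same time" race cannot actually occur, so the counter idea would also be safe on older versions. A minimal sketch with placeholder URLs:

        // Sketch: $.when() resolves once all five requests have returned (jQuery 1.5+).
        // The URLs are placeholders for the real endpoints.
        $('#loading').show();
        $.when(
            $.get('/data/1'), $.get('/data/2'), $.get('/data/3'),
            $.get('/data/4'), $.get('/data/5')
        ).done(function (r1, r2, r3, r4, r5) {
            // each rN is [data, statusText, jqXHR] for the corresponding call
            $('#loading').hide();
            // render all five results here
        });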

    Read the article

  • What's the best way/event to use to detect when a textbox's text has finished changing?

    - by Spooky2010
    Using WinForms, C#, VS 2008. I have textbox1, textbox2 and textbox3 on a form, where Textbox3.Text = textbox1.Text + textbox2.Text. I need textbox3 to be updated whenever the contents of textbox1 or textbox2 change, either manually or programmatically. The problem is that if I use the TextChanged event, it keeps firing as one types in the textbox. I need a way to call my method to fill textbox3 after either tb1 or tb2 has FINISHED changing (programmatically or via key entry), and not fire every time a letter of text is entered. How can I have textbox3 update only when tb1 or tb2 has finished changing?
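
    One common way to approximate "finished changing" is to debounce: restart a short idle timer on every TextChanged and rebuild textbox3 only when no change has arrived for a moment. A minimal sketch (the 400 ms interval is an arbitrary assumption; handling Leave/Validated instead is simpler if "finished" just means the user moved to another control):

        // Sketch: a WinForms Timer restarted on every change; it fires once typing pauses.
        private readonly Timer _idleTimer = new Timer { Interval = 400 };

        public Form1()
        {
            InitializeComponent();
            _idleTimer.Tick += (s, e) =>
            {
                _idleTimer.Stop();
                textBox3.Text = textBox1.Text + textBox2.Text;   // runs once per burst of changes
            };
            EventHandler restart = (s, e) => { _idleTimer.Stop(); _idleTimer.Start(); };
            textBox1.TextChanged += restart;
            textBox2.TextChanged += restart;
        }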

    Read the article

  • [C#] How to consume a web service that adheres to the Event-based Asynchronous Pattern?

    - by codemonkie
    I am following the example from http://msdn.microsoft.com/en-us/library/8wy069k1.aspx to consume a web service implemented (by 3rd party) using the Event-based Asynchronous Pattern. However, my program needs to do multiple calls to the DoStuffAsync() hence will get back as many DoStuffCompleted. I chose the overload which takes an extra parameter - Object userState to distinguish them. My first question is: Is it valid to cast a GUID to Object as below, where GUID is used to generate unique taskID? Object userState = Guid.NewGuid(); Secondly, do I need to spawn off a new thread for each DoStuffAsync() call, since I am calling it multiple times? Also, would be nice to have some online examples or tutorials on this subject. (I've been googling for it the whole day and didn't get much back) Many thanks
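
    Both questions have short answers: assigning a Guid to Object simply boxes it, which is valid and is exactly what userState is for, and no extra threads are needed because an EAP *Async call returns immediately while the component raises *Completed on its own. A sketch of issuing several calls and matching up the completions (DoStuffAsync, DoStuffCompleted and their argument shapes come from the question, so the exact signatures are assumptions):

        // Sketch: correlate each async call with its completion through the boxed Guid userState.
        var pending = new Dictionary<Guid, string>();

        client.DoStuffCompleted += (sender, e) =>
        {
            var taskId = (Guid)e.UserState;                 // unbox the Guid passed in
            Console.WriteLine("Call {0} ({1}) finished", taskId, pending[taskId]);
        };

        foreach (var input in new[] { "first", "second", "third" })
        {
            object userState = Guid.NewGuid();              // boxing a Guid into Object is fine
            pending[(Guid)userState] = input;
            client.DoStuffAsync(input, userState);          // returns at once; no manual thread needed
        }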

    Read the article

  • How to extract a Vorbis stream from a WAVE file?

    - by H.B.
    I would like to move the Vorbis stream into an ogg container but ffmpeg does not seem to recognize the stream, even though MPlayer gives this output upon playback:

        Opening audio decoder: [acm] Win32/ACM decoders
        Loading codec DLL: 'vorbis.acm'
        Loaded DLL driver vorbis.acm at 10000000
        Warning! ACM codec reports srcsize=0
        AUDIO: 44100 Hz, 2 ch, s16le, 128.0 kbit/9.07% (ratio: 16000-176400)
        Selected audio codec: [vorbisacm] afm: acm (OggVorbis ACM)

    ffmpeg:

        ffmpeg -i Source.wav -acodec copy Target.ogg

        Input #0, wav, from 'Source.wav':
          Duration: 00:02:15.17, bitrate: 128 kb/s
          Stream #0.0: Audio: qg[0][0] / 0x6771, 44100 Hz, 2 channels, 128 kb/s
        [ogg @ 00000000003096C0] Unsupported codec id in stream 0
        Output #0, ogg, to 'Target.ogg':
          Metadata:
            encoder : Lavf53.6.0
          Stream #0.0: Audio: qg[0][0] / 0x6771, 44100 Hz, 2 channels, 128 kb/s
        Stream mapping:
          Stream #0.0 -> #0.0
        Could not write header for output file #0 (incorrect codec parameters ?)

    Of course this does not necessarily need to be done via ffmpeg; any method that is workable would be fine... I have cut down one of the files to 512KB: sample.wav (changed two chunk size fields in the wave header to account for this, the embedded stream is cut "without notice").

    Read the article

  • Asynchronous daemon processing / ORM interaction with Django

    - by perrierism
    I'm looking for a way to do asynchronous data processing with a daemon that uses Django ORM. However, the ORM isn't thread-safe; it's not thread-safe to try to retrieve / modify django objects from within threads. So I'm wondering what the correct way to achieve asynchrony is? Basically what I need to accomplish is taking a list of users in the db, querying a third party api and then making updates to user-profile rows for those users. As a daemon or background process. Doing this in series per user is easy, but it takes too long to be at all scalable. If the daemon is retrieving and updating the users through the ORM, how do I achieve processing 10-20 users at a time? I would use a standard threading / queue system for this but you can't thread interactions like models.User.objects.get(id=foo) ... Django itself is an asynchronous processing system which makes asynchronous ORM calls(?) for each request, so there should be a way to do it? I haven't found anything in the documentation so far. Cheers
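
    For what it's worth, the ORM can be used from multiple threads - each thread simply opens its own database connection - so the usual pattern is to fan the slow third-party API calls out to a small worker pool (or to a task queue such as Celery) rather than avoid threads altogether. A rough sketch, in which call_third_party_api() and the profile update are placeholders for the poster's own logic:

        # Sketch: refresh 10-20 profiles at a time with a thread pool.
        # call_third_party_api() and update_from() are hypothetical placeholders.
        from concurrent.futures import ThreadPoolExecutor
        from django import db
        from django.contrib.auth.models import User

        def refresh(user_id):
            data = call_third_party_api(user_id)      # the slow, I/O-bound part
            user = User.objects.get(id=user_id)       # each worker thread gets its own DB connection
            user.profile.update_from(data)
            user.profile.save()
            db.connection.close()                     # return the connection cleanly

        ids = list(User.objects.values_list("id", flat=True))
        with ThreadPoolExecutor(max_workers=10) as pool:
            pool.map(refresh, ids)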

    Read the article

  • -[NSCFData writeStreamHandleEvent:]: unrecognized selector sent to instance in a stream callback

    - by user295491
    Hi everyone, I am working with streams and sockets in iPhone SDK 3.1.3. The issue is that when the program accepts a callback and I want to handle this write-stream callback, the following error is triggered: "Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[NSCFData writeStreamHandleEvent:]: unrecognized selector sent to instance 0x17bc70'". But I don't know how to solve it because everything seems fine. Even when I run the debugger there is no error and the program works. Any hint here will help! The code of the callback is:

        void myWriteStreamCallBack(CFWriteStreamRef stream, CFStreamEventType eventType, void *info) {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            Connection *handlerEv = [[(Connection *)info retain] autorelease];
            [handlerEv writeStreamHandleEvent:eventType];
            [pool release];
        }

    The code of writeStreamHandleEvent:

        - (void)writeStreamHandleEvent:(CFStreamEventType)eventType {
            switch (eventType) {
                case kCFStreamEventOpenCompleted:
                    writeStreamOpen = YES;
                    break;
                case kCFStreamEventCanAcceptBytes:
                    NSLog(@"Writing in the stream");
                    [self writeOutgoingBufferToStream];
                    break;
                case kCFStreamEventErrorOccurred:
                    error = CFWriteStreamGetError(writeStream);
                    fprintf(stderr, "CFReadStreamGetError returned (%ld, %ld)\n", error.domain, error.error);
                    CFWriteStreamUnscheduleFromRunLoop(writeStream, CFRunLoopGetCurrent(), kCFRunLoopCommonModes);
                    CFWriteStreamClose(writeStream);
                    CFRelease(writeStream);
                    break;
                case kCFStreamEventEndEncountered:
                    CFWriteStreamUnscheduleFromRunLoop(writeStream, CFRunLoopGetCurrent(), kCFRunLoopCommonModes);
                    CFWriteStreamClose(writeStream);
                    CFRelease(writeStream);
                    break;
            }
        }

    The code of the stream configuration:

        CFSocketContext ctx = {0, self, nil, nil, nil};
        CFWriteStreamSetClient(writeStream, registeredEvents,
                               (CFWriteStreamClientCallBack)&myWriteStreamCallBack,
                               (CFStreamClientContext *)(&ctx));
        CFWriteStreamScheduleWithRunLoop(writeStream, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode);

    You can see that there is nothing strange! Well, at least I don't see it. Thank you in advance.

    Read the article

  • Png image processing in .NET.

    - by Oybek
    I have the following task. Take a base image and overlay on it another one. The base image is 8b png as well as overlay. Here are the base (left) and overlay (right) images. Here is a result and how it must look. The picture in the left is a screenshot when one picture is on top of another (html and positioning) and the second is the result of programmatic merging. As you can see in the screenshot the borders of the text is darker. Also here are the sizes of the images Base image 14.9 KB Overlay image 6.87 KB Result image 34.8 KB The size of the resulting image is also huge Here is my code that I use to merge those pictures /*...*/ public Stream Concatinate(Stream baseStream, params Stream[] overlayStreams) { var @base = Image.FromStream(baseStream); var canvas = new Bitmap(@base.Width, @base.Height); using (var g = canvas.ToGraphics()) { g.DrawImage(@base, 0, 0); foreach (var item in overlayStreams) { using (var overlayImage = Image.FromStream(item)) { try { Overlay(@base, overlayImage, g); } catch { } } } } var ms = new MemoryStream(); canvas.Save(ms, ImageFormat.Png); canvas.Dispose(); @base.Dispose(); return ms; } /*...*/ /*Tograpics extension*/ public static Graphics ToGraphics(this Image image, CompositingQuality compositingQuality = CompositingQuality.HighQuality, SmoothingMode smoothingMode = SmoothingMode.HighQuality, InterpolationMode interpolationMode = InterpolationMode.HighQualityBicubic) { var g = Graphics.FromImage(image); g.CompositingQuality = compositingQuality; g.SmoothingMode = smoothingMode; g.InterpolationMode = interpolationMode; return g; } My questions are What should I do in order to merge images to achieve the result as in the screenshot? How can I lower the size of the result image? Is the System.Drawing a suitable tool for this or is there any better tool for working with png for .NET?
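
    Two observations, offered as likely causes rather than a verified fix: new Bitmap(width, height) creates a 32-bpp ARGB surface, and canvas.Save(..., ImageFormat.Png) therefore writes a 32-bit PNG, which by itself explains why the output is several times larger than the 8-bit sources (GDI+ will not re-quantize to a palettized PNG for you - that needs a separate quantization step or another imaging library). The darker text edges usually come from how the overlay's semi-transparent anti-aliased pixels are composited, so the Graphics compositing settings are worth experimenting with before blaming the PNGs themselves. A sketch of the settings to vary (variable names shortened from the question):

        // Sketch only - settings worth varying while diagnosing the dark fringes.
        using (var g = Graphics.FromImage(canvas))
        {
            g.CompositingMode = CompositingMode.SourceOver;           // blend the overlay's alpha over the base
            g.CompositingQuality = CompositingQuality.AssumeLinear;   // compare with Default and GammaCorrected
            g.InterpolationMode = InterpolationMode.NearestNeighbor;  // no resampling when sizes already match
            g.DrawImage(baseImage, 0, 0, baseImage.Width, baseImage.Height);
            g.DrawImage(overlayImage, 0, 0, overlayImage.Width, overlayImage.Height);
        }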

    Read the article

  • C# Process Binary File, Multi-Thread Processing

    - by washtik
    I have the following code that processes a binary file. I want to split the processing workload by using threads and assigning each line of the binary file to threads in the ThreadPool. Processing time for each line is only small but when dealing with files that might contain hundreds of lines, it makes sense to split the workload. My question is regarding the BinaryReader and thread safety. First of all, is what I am doing below acceptable. I have a feeling it would be better to pass only the binary for each line to the PROCESS_Binary_Return_lineData method. Please note the code below is conceptual. I looking for a but of guidance on this as my knowledge of multi-threading is in its infancy. Perhaps there is a better way to achieve the same result, i.e. split processing of each binary line. var dic = new Dictionary<DateTime, Data>(); var resetEvent = new ManualResetEvent(false); using (var b = new BinaryReader(File.Open(Constants.dataFile, FileMode.Open, FileAccess.Read, FileShare.Read))) { var lByte = b.BaseStream.Length; var toProcess = 0; while (lByte >= DATALENGTH) { b.BaseStream.Position = lByte; lByte = lByte - AB_DATALENGTH; ThreadPool.QueueUserWorkItem(delegate { Interlocked.Increment(ref toProcess); var lineData = PROCESS_Binary_Return_lineData(b); lock(dic) { if (!dic.ContainsKey(lineData.DateTime)) { dic.Add(lineData.DateTime, lineData); } } if (Interlocked.Decrement(ref toProcess) == 0) resetEvent.Set(); }, null); } } resetEvent.WaitOne();
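
    The riskiest part is sharing one BinaryReader across ThreadPool threads: neither BinaryReader nor its underlying stream is thread-safe, and the queued delegates also capture b while the loop keeps repositioning it. A common restructuring - in line with the poster's own instinct - is to do all the reading on the loop's thread and hand each worker only its own byte[]. The sketch below assumes .NET 4 (Task, ConcurrentDictionary) and reuses the poster's names DATALENGTH and PROCESS_Binary_Return_lineData, now taking a byte[]; on .NET 3.5 the same shape works with QueueUserWorkItem plus a lock:

        // Sketch: read each record sequentially, process the records in parallel.
        var dic = new ConcurrentDictionary<DateTime, Data>();
        var tasks = new List<Task>();

        using (var b = new BinaryReader(File.Open(Constants.dataFile, FileMode.Open,
                                                  FileAccess.Read, FileShare.Read)))
        {
            long pos = b.BaseStream.Length - DATALENGTH;
            while (pos >= 0)
            {
                b.BaseStream.Position = pos;
                byte[] record = b.ReadBytes(DATALENGTH);        // reading stays on this thread
                tasks.Add(Task.Factory.StartNew(() =>
                {
                    var lineData = PROCESS_Binary_Return_lineData(record);
                    dic.TryAdd(lineData.DateTime, lineData);    // thread-safe; no lock needed
                }));
                pos -= DATALENGTH;
            }
        }
        Task.WaitAll(tasks.ToArray());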

    Read the article

  • Stream PDF to another local App

    - by Nathan
    Hi, I'm currently trying to optimize a small firefox extension that will grab a pdf off the current document and send it to a port that another local application is listening on. Right now it uses a terrifying hackjob of cache viewer. The way I'm getting it is loading the cache, searching through it using the current URL and grabbing the file and saving it to a temp directory. Then I stream the file in, delete the temp, and send it through the socket. Now, my new design, ideally I'd want to build it from scratch and cut out saving it to the local machine at all, and just stream it through the socket. I've been looking at doing something like, //check page to ensure its a pdf //init in/out streams //stream through sock //flush Now, this would be vastly superior to the 400 line hacked up mess I have now, but I'm new to building FF extensions, and after reading a lot about URIs and the file streaming and such I'm probably more confused than when I started trying to fix this three hours ago. I'm okay with sending things through the sockets and whatnot, I understand that, I'm mainly confused about what multitude of interfaces I want to use. Gah! Thanks! Also, long time reader, first time poster!

    Read the article

  • App-Engine Parse a UrlFetch UTF-8 encoded stream

    - by Davidrd91
    I am trying to parse an XML from a URL using the xml.sax parser. I know there are other libraries to use but coming from Java this is the one I am most familiar with and seems the least complicated to me. The code I'm using to parse is as follows: parser = xml.sax.make_parser() handler = MangaHandler() parser.setContentHandler(handler) url = urlfetch.Fetch('http://www.mangapanda.com/alphabetical', allow_truncated = False, follow_redirects = False, deadline = False) xml.sax.parseString(url.content, handler) This returns a SaxException (invalid token) once the parser reaches the first & sign: SAXParseException: <unknown>:582:34: not well-formed (invalid token) Because urlfetch returns a string and not a stream I cannot use the parse() (which only works with streams) and am left to use parseString() instead. To see if parsing as a stream would fix this I tried: parser.parse(io.StringIO(url.content).encode('utf-8')) but this returns: TypeError: initial_value must be unicode or None, not str I have also tried to use the urllib2 libraries which do return a stream instead of urlfetch but the file is too large and is automatically truncated, leaving me with missing data. Any Sort of work-around for this would be greatly appreciated as I've spent days getting around one obstacle just to be stopped by another.
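
    Two separate things are going on here. The TypeError is only about the wrapper: in Python 2, io.StringIO wants unicode, so raw urlfetch bytes need io.BytesIO (or an explicit decode) before they can be treated as a stream. The invalid-token error is more fundamental: the page being fetched is HTML, not XML, and a bare & is simply not well-formed XML, so xml.sax will reject it however it is fed in - an HTML-tolerant parser (BeautifulSoup, lxml.html, html5lib) is the usual way out. A sketch of the stream wrapping:

        # Sketch: urlfetch returns a byte string; wrap it as a byte stream for parse().
        import io
        parser.parse(io.BytesIO(url.content))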

    Read the article

  • How can unrealscript halt event handler execution after an arbitrary number of lines with no return or error?

    - by Dan Cowell
    I have created a class that extends TcpLink and is instantiated in a custom Kismet Sequence Action. It is being instantiated correctly and is making the GET HTTP request that I need it to (I have checked my access log in apache) and Apache is responding to the request with the appropriate content. The problem I have is that I'm using the event receive mode and it appears that somehow the handler for the Opened event is halted after a specific number of lines of code have executed. Here is my code for the Opened event: event Opened() { // A connection was established WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] event opened"); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Sending simple HTTP query"); //The HTTP GET request //char(13) and char(10) are carrage returns and new lines requesttext = "userId="$userId$"&apartmentId="$apartmentId; SendText("GET /"$path$"?"$requesttext$" HTTP/1.0"); SendText(chr(13)$chr(10)); SendText("Host: "$TargetHost); SendText(chr(13)$chr(10)); SendText("Connection: Close"); SendText(chr(13)$chr(10)$chr(13)$chr(10)); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Sent request: "$requesttext); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] end HTTP query"); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] LinkState: "$LinkState); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] LinkMode: "$LinkMode); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] ReceiveMode: "$ReceiveMode); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Error: "$string(GetLastError())); } As you can see, a number of the Broadcast calls have been commented out. Initially, only the lines up to the Broadcast containing "[DNomad_TcpLinkClient] Sent request: " were being executed and none of the Broadcasts were commented out. After commenting out that line, the next Broadcast was successful and so on and so forth. As a test, I commented out the very first Broadcast to see if the connection closing had any effect: // A connection was established //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] event opened"); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Sending simple HTTP query"); Upon doing that, an additional Broadcast at the end of the function executed. Thus the inference that there is an upper limit to the number of lines executed. Additionally, my ReceivedText handler is never called, despite Apache returning the correct HTTP 200 response with a body. My working hypothesis is that somehow after the Sequence Action finishes executing the garbage collector cleans up the TcpLinkClient instance. My biggest source of confusion with that is how on earth it does it during the execution of an event handler. Has anyone ever seen anything like this before? My full TcpLinkClient class is below: /* * TcpLinkClient based on an example usage of the TcpLink class by Michiel 'elmuerte' Hendriks for Epic Games, Inc. 
* */ class DNomad_TcpLinkClient extends TcpLink; var PlayerController PC; var string TargetHost; var int TargetPort; var string path; var string requesttext; var string userId; var string apartmentId; var string statusCode; var string responseData; event PostBeginPlay() { super.PostBeginPlay(); } function DoTcpLinkRequest(string uid, string id) //removes having to send a host { userId = uid; apartmentId = id; Resolve(targethost); } function string GetStatus() { return statusCode; } event Resolved( IpAddr Addr ) { // The hostname was resolved succefully WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] "$TargetHost$" resolved to "$ IpAddrToString(Addr)); // Make sure the correct remote port is set, resolving doesn't set // the port value of the IpAddr structure Addr.Port = TargetPort; //dont comment out this log because it rungs the function bindport WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Bound to port: "$ BindPort() ); if (!Open(Addr)) { WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Open failed"); } } event ResolveFailed() { WorldInfo.Game.Broadcast(self, "[TcpLinkClient] Unable to resolve "$TargetHost); // You could retry resolving here if you have an alternative // remote host. //send failed message to scaleform UI //JunHud(JunPlayerController(PC).myHUD).JunMovie.CallSetHTML("Failed"); } event Opened() { // A connection was established //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] event opened"); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Sending simple HTTP query"); //The HTTP GET request //char(13) and char(10) are carrage returns and new lines requesttext = "userId="$userId$"&apartmentId="$apartmentId; SendText("GET /"$path$"?"$requesttext$" HTTP/1.0"); SendText(chr(13)$chr(10)); SendText("Host: "$TargetHost); SendText(chr(13)$chr(10)); SendText("Connection: Close"); SendText(chr(13)$chr(10)$chr(13)$chr(10)); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Sent request: "$requesttext); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] end HTTP query"); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] LinkState: "$LinkState); //WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] LinkMode: "$LinkMode); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] ReceiveMode: "$ReceiveMode); WorldInfo.Game.Broadcast(self, "[DNomad_TcpLinkClient] Error: "$string(GetLastError())); } event Closed() { // In this case the remote client should have automatically closed // the connection, because we requested it in the HTTP request. WorldInfo.Game.Broadcast(self, "Connection closed."); // After the connection was closed we could establish a new // connection using the same TcpLink instance. } event ReceivedText( string Text ) { WorldInfo.Game.Broadcast(self, "Received Text: "$Text); //we dont want the header info, so we split the string after two new lines Text = Split(Text, chr(13)$chr(10)$chr(13)$chr(10), true); WorldInfo.Game.Broadcast(self, "Split Text: "$Text); statusCode = Text; } event ReceivedLine( string Line ) { WorldInfo.Game.Broadcast(self, "Received Line: "$Line); } event ReceivedBinary( int Count, byte B[255] ) { WorldInfo.Game.Broadcast(self, "Received Binary of length: "$Count); } defaultproperties { TargetHost="127.0.0.1" TargetPort=80 //default for HTTP LinkMode=MODE_Text ReceiveMode=RMODE_Event path = "dnomad/datafeed.php" userId = "0"; apartmentId = "0"; statusCode = ""; send = false; }

    Read the article

  • SQLAuthority News – Virtual Launch Event for Office 2010 – Contest – Win MS Office License

    - by pinaldave
    Office products are integral products of any PC. I accept that without Office Suites, I can not survive or make enough leaving. I am blogger and use word to create my blogs. I am SQL Server Trainer  and I use PowerPoint as my presentation tool. I am SQL Server consultant and I use Excel to keep my work log. I can not see my life with Office Tools. Just like any other Microsoft Product there is strong community following Office Tools. Please count me in. The same community is hosting a Virtual Launch Event for Office 2010 on May 25 and 26th. The webcasts is FREE to attend and people can take part either online or by going to the nearest available center. The sessions will be delivered by MVPs. To register please visit: http://www.meraoffice.com. In June, limited cities will be hosting Community Launch Events for Office 2010. At the launch events, attendees will get to see Office 2010 in action and learn how to do their work better with Office 2010.  The details are available on http://office.merawindows.com. To support one of the largest community, I am announcing one contents. It is very easy to take part in the contest. You just have to answer one very simple question. Contest: Choose best option: With which Microsoft Office Product Powerpivot is associated? Options: 1) PowerPoint 2) Excel 3) Word Hint: http://search.sqlauthority.com Rules: Winner will be awarded 1 Office 2007 Home and Student. This will be freely upgradeable to Office 2010 once it releases in June. The winners will be sent emails and they will redeem their awards via microsoftstore.co.in The prizes can only be shipped to India and Indian residents are eligible. Winner will be selected by selected community leaders and MVPs at their sole discretion. Winner will be informed by email about the award. Most creative and informative comment will win the contest. Please spread the words about this contest. SQLAuthority.com will also send SQL Server book to the person who generates the most traffic to this blog post using Twitter, Facebook and other social media. This competition is also open to Indian residents only. I will measure the traffic using my wordpress.com stats plugin. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: Office

    Read the article

  • AJAX event, prevents other page actions

    - by cobaltduck
    Here's a fairly average scenario, using JSF as an example, but this same concept I have observed in ASP.NET, Apache Wicket, and other frameworks with ajax capabilities. <h:inputText id="text1" value="#{myBacker.myBean.myStringVar}" styleClass="goodCSS"> <f:ajax event="change" listener="#{myBacker.text1ChangeEventMethod}" update="someOtherField" /> </h:inputText> <h:selectBooleanCheckbox id="check1" value="#{myBacker.myBean.myBoolVar}" /> Let's suppose that the 'text1ChangeEventListener' is essential to 'someOtherField' and perhaps toggles its disabled attribute, or changes its available options, based on the value of 'myStringVar.' The particulars aren't important, let's just accept that for some reason we need an ajax call when the 'text1' value is changed. So Jane User is working her way down the form. She arrives at the 'text1' field and types some value. The cursor focus is still in the text field, as she moves her mouse to the 'check1' box and clicks. It appears to her that nothing has happened. She clicks again, and this time the checkbox highlights and the icon indicating a selection appears in the box. Jane has to do several entries in the form today, and sees this happen every time, and it becomes very frustrating for her. Likewise, Jeff Admin is also perusing this form, and begins to type in 'text1.' He then realizes he doesn't really want to enter this data, and so moves his mouse to the "cancel" button elsewhere on the page, and clicks. Nothing seems to happen. Jeff clicks again, and after confirming he really does want to cancel, is returned to the home page. Jeff scratches his head. The problem is simply that the first thing the system does after 'text1' looses focus is run the listener and perform the ajax operation. It may only take a fraction of a second, but still, you can click other buttons all you want, but until that ajax has finished, everything else is ignored. I've spent the morning searching and reading, and it seems no one else has even noticed this. I could find not one article, blog, past question here or at SO, or anyting that addresses this obvious and glaring deficiency in ajax. So first of all, am I truly alone in thinking this is a big problem? Second, does anyone have a solution?

    Read the article

  • UK Oracle User Group Event: Trends in Identity Management

    - by B Shashikumar
    As threat levels rise and new technologies such as cloud and mobile computing gain widespread acceptance, security is occupying more and more mindshare among IT executives. To help prepare for the rapidly changing security landscape, the Oracle UK User Group community and our partners at Enline/SENA have put together an User Group event in London on Apr 19 where you can learn more from your industry peers about upcoming trends in identity management. Here are some of the key trends in identity management and security that we predicted at the beginning of last year and look how they have turned out so far. You have to admit that we have a pretty good track record when it comes to forecasting trends in identity management and security. Threat levels will grow—and there will be more serious breaches:   We have since witnessed breaches of high value targets like RSA and Epsilon. Most organizations have not done enough to protect against insider threats. Organizations need to look for security solutions to stop user access to applications based on real-time patterns of fraud and for situations in which employees change roles or employment status within a company. Cloud computing will continue to grow—and require new security solutions: Cloud computing has since exploded into a dominant secular trend in the industry. Cloud computing continues to present many opportunities like low upfront costs, rapid deployment etc. But Cloud computing also increases policy fragmentation and reduces visibility and control. So organizations require solutions that bridge the security gap between the enterprise and cloud applications to reduce fragmentation and increase control. Mobile devices will challenge traditional security solutions: Since that time, we have witnessed proliferation of mobile devices—combined with increasing numbers of employees bringing their own devices to work (BYOD) — these trends continue to dissolve the traditional boundaries of the enterprise. This in turn, requires a holistic approach within an organization that combines strong authentication and fraud protection, externalization of entitlements, and centralized management across multiple applications—and open standards to make all that possible.  Security platforms will continue to converge: As organizations move increasingly toward vendor consolidation, security solutions are also evolving. Next-generation identity management platforms have best-of-breed features, and must also remain open and flexible to remain viable. As a result, developers need products such as the Oracle Access Management Suite in order to efficiently and reliably build identity and access management into applications—without requiring security experts. Organizations will increasingly pursue "business-centric compliance.": Privacy and security regulations have continued to increase. So businesses are increasingly look for solutions that combine strong security and compliance management tools with business ready experience for faster, lower-cost implementations.  If you'd like to hear more about the top trends in identity management and learn how to empower yourself, then join us for the Oracle UK User Group on Thu Apr 19 in London where Oracle and Enline/SENA product experts will come together to share security trends, best practices, and solutions for your business. Register Here.

    Read the article

  • Windows Server 2008 R2 Error CAPI2 Event ID 4107

    - by umar bhatti
    I am getting following error on couple of my 2008 R2 servers. I have tried couple of fixes which didn't fix the issues. Log Name: Application Source: Microsoft-Windows-CAPI2 Date: 18/03/2013 09:48:40 Event ID: 4107 Task Category: None Level: Error Keywords: Classic User: N/A Computer: ServerName Description: Failed extract of third-party root list from auto update cab at: <http://ctldl.windowsupdate.com/msdownload/update/v3/static/trustedr/en/authrootstl.cab> with error: The data is invalid. . Event Xml: <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event"> <System> <Provider Name="Microsoft-Windows-CAPI2" Guid="{5bbca4a8-b209-48dc-a8c7-b23d3e5216fb}" EventSourceName="Microsoft-Windows-CAPI2" /> <EventID Qualifiers="0">4107</EventID> <Version>0</Version> <Level>2</Level> <Task>0</Task> <Opcode>0</Opcode> <Keywords>0x8080000000000000</Keywords> <TimeCreated SystemTime="2013-03-18T09:48:40.169581600Z" /> <EventRecordID>8713</EventRecordID> <Correlation /> <Execution ProcessID="412" ThreadID="5288" /> <Channel>Application</Channel> <Computer>ServerName <Security /> </System> <EventData> <Data>http://ctldl.windowsupdate.com/msdownload/update/v3/static/trustedr/en/authrootstl.cab</Data> <Data>The data is invalid. </Data> </EventData> </Event>

    Read the article

  • overloaded stream insertion operator with a vector

    - by Julz
    hi, i'm trying to write an overloaded stream insertion operator for a class who's only member is a vector. i dont really know what i'm doing. (lets make that clear) it's a vector of "Points" which is a struct containing two doubles. i figure what i want is to insert user input (a bunch of doubles) into a stream that i then send to a modifier method? i keep working off other stream insertion examples such as... std::ostream& operator<< (std::ostream& o, Fred const& fred) { return o << fred.i_; } but when i try a similar..... istream & operator >> (istream &inStream, Polygon &vertStr) { inStream >> ws; inStream >> vertStr.vertices; return inStream; } i get an error "no match for operator etc etc. if i leave off the .vertices it compiles but i figure it's not right? (vertices is the name of my vector ) and even if it is right, i dont actually know what syntax to use in my driver to use it? also not %100 on what my modifier method needs to look like. here's my Polygon class //header #ifndef POLYGON_H #define POLYGON_H #include "Segment.h" #include <vector> class Polygon { friend std::istream & operator >> (std::istream &inStream, Polygon &vertStr); public: //Constructor Polygon(const Point &theVerts); //Default Constructor Polygon(); //Copy Constructor Polygon(const Polygon &polyCopy); //Accessor/Modifier methods inline std::vector<Point> getVector() const {return vertices;} //Return number of Vector elements inline int sizeOfVect() const {return (int) vertices.capacity();} //add Point elements to vector inline void setVertices(const Point &theVerts){vertices.push_back (theVerts);} private: std::vector<Point> vertices; }; #endif //Body using namespace std; #include "Polygon.h" // Constructor Polygon::Polygon(const Point &theVerts) { vertices.push_back (theVerts); } //Copy Constructor Polygon::Polygon(const Polygon &polyCopy) { vertices = polyCopy.vertices; } //Default Constructor Polygon::Polygon(){} istream & operator >> (istream &inStream, Polygon &vertStr) { inStream >> ws; inStream >> vertStr; return inStream; } any help greatly appreciated, sorry to be so vague, a lecturer has just kind of given us a brief example of stream insertion then left us on our own thanks. oh i realise there are probably many other problems that need fixing
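
    Two things stand out: inStream >> vertStr inside the operator calls the same operator recursively, and inStream >> vertStr.vertices cannot compile because no operator>> exists for std::vector. The operator has to read the underlying doubles itself and push Points onto the vector; a minimal sketch, assuming Point's two doubles are named x and y (adjust to the real member names):

        // Sketch: read pairs of doubles until extraction fails, pushing each Point onto the vector.
        std::istream& operator>>(std::istream& inStream, Polygon& vertStr)
        {
            Point p;
            while (inStream >> p.x >> p.y)    // whitespace, including newlines, is skipped automatically
                vertStr.setVertices(p);       // push_back onto the private vector
            return inStream;
        }

    In a driver it is then used like any other extraction, e.g. std::cin >> myPolygon; with input such as "0 0  1 0  1 1  0 1" followed by end-of-input.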

    Read the article

  • Dynamically add event to custom control (Confirm Message Box)

    - by Nyein Nyein Chan Chan
    I have created a custom cofirm message box control and I created an event like this- [Category("Action")] [Description("Raised when the user clicks the button(ok)")] public event EventHandler Submit; protected virtual void OnSubmit(EventArgs e) { if (Submit != null) Submit(this, e); } The Event OnSubmit occurs when user click the OK button on the Confrim Box. void IPostBackEventHandler.RaisePostBackEvent(string eventArgument) { OnSubmit(e); } Now I am adding this OnSubmit Event Dynamically like this- In aspx- <my:ConfirmMessageBox ID="cfmTest" runat="server" ></my:ConfirmMessageBox> <asp:Button ID="btnCallMsg" runat="server" onclick="btnCallMsg_Click" /> <asp:TextBox ID="txtResult" runat="server" ></asp:TextBox> In cs- protected void btnCallMsg_Click(object sender, EventArgs e) { cfmTest.Submit += cfmTest_Submit;//Dynamically Add Event cfmTest.ShowConfirm("Are you sure to Save Data?"); //Show Confirm Message using Custom Control Message Box } protected void cfmTest_Submit(object sender, EventArgs e) { txtResult.Text = "User Confirmed";//I set the text to "User Confrimed" but it's not displayed txtResult.Focus();//I focus the textbox but I got Error } The Error I got is- System.InvalidOperationException was unhandled by user code Message="SetFocus can only be called before and during PreRender." Source="System.Web" So, when I dynamically add and fire custom control's event, there is an error in Web Control. If I add event in aspx file like this, <my:ConfirmMessageBox ID="cfmTest" runat="server" OnSubmit="cfmTest_Submit"></my:ConfirmMessageBox> There is no error and work fine. Can anybody help me to add event dynamically to custom control? Thanks.
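
    One thing to check: a handler attached inside another control's Click does not survive to the next postback, so by the time the confirm control posts back and raises Submit the subscription added in btnCallMsg_Click may already be gone; attaching it on every request (Page_Load or Page_Init) keeps it alive. The SetFocus exception suggests Submit is being raised later in the control's life cycle than Focus() tolerates, in which case emitting a small client-side focus script avoids it. A sketch of both, offered as a guess at the cause rather than a verified fix:

        // Sketch: wire the handler on every request so it exists when the confirm posts back.
        protected void Page_Load(object sender, EventArgs e)
        {
            cfmTest.Submit += cfmTest_Submit;
        }

        protected void cfmTest_Submit(object sender, EventArgs e)
        {
            txtResult.Text = "User Confirmed";
            // If Focus() is too late in the life cycle here, focus the box on the client instead:
            ClientScript.RegisterStartupScript(GetType(), "focusResult",
                "document.getElementById('" + txtResult.ClientID + "').focus();", true);
        }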

    Read the article

  • Lazarus - why I can't assign event to run-time component?

    - by Garret Raziel
    Hello, I have this Lazarus program: unit Unit2; {$mode objfpc}{$H+} interface uses Classes, SysUtils, FileUtil, LResources, Forms, Controls, Graphics, Dialogs, StdCtrls, ComCtrls; type { TForm2 } TForm2 = class(TForm) procedure OnTlacitkoClick(Sender: TObject); procedure FormClose(Sender: TObject; var CloseAction: TCloseAction); procedure FormCreate(Sender: TObject); tlac:TButton; private { private declarations } public { public declarations } end; var Form2: TForm2; implementation { TForm2 } procedure TForm2.OnTlacitkoClick(Sender: TObject); begin showmessage('helloworld'); end; procedure TForm2.FormCreate(Sender: TObject); var i,j:integer; begin tlac:=TButton.Create(Form2); tlac.OnClick:=OnTlacitkoClick; tlac.Parent:=Form2; tlac.Left:=100; tlac.Top:=100; end; initialization {$I unit2.lrs} end. but compiler says: unit2.pas(63,32) Error: Wrong number of parameters specified for call to "OnTlacitkoClick" in tlac.OnClick:=OnTlacitkoClick; expression. I have searched and think that this is legal expression in Delphi. I want simply to registrate OnTlacitkoClick as tlac.OnClick event,not to call this procedure. Is there somethink wrong with the code or must I do it different in Lazarus/Freepascal? Thanks.
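
    The expression is legal in Delphi, but this unit is compiled in {$mode objfpc}, and in that mode taking a reference to a method requires the @ operator - without it the compiler assumes you are trying to call OnTlacitkoClick, hence the wrong-number-of-parameters error. The one-line fix is sketched below (alternatively, {$mode delphi} accepts the original assignment):

        // In {$mode objfpc} an event assignment needs @; in {$mode delphi} it does not.
        tlac.OnClick := @OnTlacitkoClick;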

    Read the article

  • NTDS Replication Warning (Event ID 2089)

    - by Chris_K
    I have a simple little network with 3 AD servers in 2 sites. Site A has Win2k3 SP2 and Win2k SP4 servers, site B has a single Win2k3 SP2 server. All have been in place for at least 3 years now. Just last week I started getting Event 2089 "not backed up" warnings (example below) on both of the win2k3 servers. I understand what the message means, no need to send me links to the technet article explaining it. I'll improve my backups. What I'm more curious about is why did I just start getting this message now? Why haven't I been getting it for the past 3 years?!? Perhaps this is related: I recently decommissioned a few other sites and AD controllers (there used to be 3 more sites, each with their own controller). Don't worry, I did proper DCpromo exercises and made sure we didn't lose anything. But would shutting those down possibly be related to why I get this error now? This won't keep me awake at night but I am curious as to what changed... Event Type: Warning Event Source: NTDS Replication Event Category: Backup Event ID: 2089 Date: 3/28/2010 Time: 9:25:27 AM User: NT AUTHORITY\ANONYMOUS LOGON Computer: RedactedName Description: This directory partition has not been backed up since at least the following number of days. Directory partition: DC=MyDomain,DC=com 'Backup latency interval' (days): 30 It is recommended that you take a backup as often as possible to recover from accidental loss of data. However if you haven't taken a backup since at least the 'backup latency interval' number of days, this message will be logged every day until a backup is taken. You can take a backup of any replica that holds this partition. By default the 'Backup latency interval' is set to half the 'Tombstone Lifetime Interval'. If you want to change the default 'Backup latency interval', you could do so by adding the following registry key. 'Backup latency interval' (days) registry key: System\CurrentControlSet\Services\NTDS\Parameters\Backup Latency Threshold (days) For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    Read the article

  • How to encrypt and save a binary stream after serialization and read it back?

    - by Anindya Chatterjee
    I am having some problems in using CryptoStream when I want to encrypt a binary stream after binary serialization and save it to a file. I am getting the following exception System.ArgumentException : Stream was not readable. Can anybody please show me how to encrypt a binary stream and save it to a file and deserialize it back correctly? The code is as follows: class Program { public static void Main(string[] args) { var b = new B {Name = "BB"}; WriteFile<B>(@"C:\test.bin", b, true); var bb = ReadFile<B>(@"C:\test.bin", true); Console.WriteLine(b.Name == bb.Name); Console.ReadLine(); } public static T ReadFile<T>(string file, bool decrypt) { T bObj = default(T); var _binaryFormatter = new BinaryFormatter(); Stream buffer = null; using (var stream = new FileStream(file, FileMode.OpenOrCreate)) { if(decrypt) { const string strEncrypt = "*#4$%^.++q~!cfr0(_!#$@$!&#&#*&@(7cy9rn8r265&$@&*E^184t44tq2cr9o3r6329"; byte[] dv = {0x12, 0x34, 0x56, 0x78, 0x90, 0xAB, 0xCD, 0xEF}; CryptoStream cs; DESCryptoServiceProvider des = null; var byKey = Encoding.UTF8.GetBytes(strEncrypt.Substring(0, 8)); using (des = new DESCryptoServiceProvider()) { cs = new CryptoStream(stream, des.CreateEncryptor(byKey, dv), CryptoStreamMode.Read); } buffer = cs; } else buffer = stream; try { bObj = (T) _binaryFormatter.Deserialize(buffer); } catch(SerializationException ex) { Console.WriteLine(ex.Message); } } return bObj; } public static void WriteFile<T>(string file, T bObj, bool encrypt) { var _binaryFormatter = new BinaryFormatter(); Stream buffer; using (var stream = new FileStream(file, FileMode.Create)) { try { if(encrypt) { const string strEncrypt = "*#4$%^.++q~!cfr0(_!#$@$!&#&#*&@(7cy9rn8r265&$@&*E^184t44tq2cr9o3r6329"; byte[] dv = {0x12, 0x34, 0x56, 0x78, 0x90, 0xAB, 0xCD, 0xEF}; CryptoStream cs; DESCryptoServiceProvider des = null; var byKey = Encoding.UTF8.GetBytes(strEncrypt.Substring(0, 8)); using (des = new DESCryptoServiceProvider()) { cs = new CryptoStream(stream, des.CreateEncryptor(byKey, dv), CryptoStreamMode.Write); buffer = cs; } } else buffer = stream; _binaryFormatter.Serialize(buffer, bObj); buffer.Flush(); } catch(SerializationException ex) { Console.WriteLine(ex.Message); } } } } [Serializable] public class B { public string Name {get; set;} } It throws the serialization exception as follows The input stream is not a valid binary format. The starting contents (in bytes) are: 3F-17-2E-20-80-56-A3-2A-46-63-22-C4-49-56-22-B4-DA ...
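
    Three details in the posted code are worth fixing before anything else, whichever of them is the exact source of the two exceptions: the DESCryptoServiceProvider lives in a using block that ends before the CryptoStream is used, ReadFile builds its CryptoStream with CreateEncryptor instead of CreateDecryptor, and the CryptoStream is never closed, so the final DES block is never flushed to the file. A sketch of both paths with those points addressed (key and IV handling kept exactly as in the question):

        // Sketch - write path: serialize through an encrypting stream and flush the final block.
        using (var des = new DESCryptoServiceProvider())
        using (var fs = new FileStream(path, FileMode.Create))
        using (var cs = new CryptoStream(fs, des.CreateEncryptor(byKey, dv), CryptoStreamMode.Write))
        {
            new BinaryFormatter().Serialize(cs, bObj);
            cs.FlushFinalBlock();                       // make sure the last DES block reaches the file
        }

        // Sketch - read path: the matching decryptor, not another encryptor.
        using (var des = new DESCryptoServiceProvider())
        using (var fs = new FileStream(path, FileMode.Open))
        using (var cs = new CryptoStream(fs, des.CreateDecryptor(byKey, dv), CryptoStreamMode.Read))
        {
            bObj = (T)new BinaryFormatter().Deserialize(cs);
        }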

    Read the article

  • DegradedArray event on /dev/md0 without actually having a RAID

    - by J. Stoever
    Since I upgraded from Ubuntu LTS 10 to LTS 12, I have been getting error messages like: N 60 mdadm monitoring Mon Sep 3 06:38 31/1022 DegradedArray event on /dev/md2:Ubuntu-1004-lucid-64-minimal N 61 mdadm monitoring Mon Sep 3 06:38 31/1022 DegradedArray event on /dev/md0:Ubuntu-1004-lucid-64-minimal N 62 mdadm monitoring Mon Sep 3 06:38 31/1022 DegradedArray event on /dev/md1:Ubuntu-1004-lucid-64-minimal We do not have a RAID setup, and only have a single hard drive. Ideas ?
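
    Hosting providers' minimal images (the Ubuntu-1004-lucid-64-minimal hostname in the messages suggests one) are often installed on Linux software RAID without it being obvious, so before treating the warnings as noise it is worth asking the kernel what md0, md1 and md2 actually are (/dev/sda1 below is a placeholder for whichever partition mdstat lists):

        cat /proc/mdstat
        sudo mdadm --detail /dev/md0
        sudo mdadm --examine /dev/sda1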

    Read the article

  • Disabling repeating keyboard down event in as3

    - by psy-sci
    now I'm trying to make the keyboard events to stop repeating. My idea was to have a true and false condition for when the key is pressed so that it wont repeat if the key is down already. //Mouse Event Over keyCButton.addEventListener(MouseEvent.MOUSE_OVER, function(){gotoAndStop(2)}); //Variable var Qkey:uint = 81; //Key Down Event stage.addEventListener(KeyboardEvent.KEY_DOWN, keydown); var soundplayed = false; function keydown(event:KeyboardEvent){ if (event.keyCode==Qkey) { this.soundplayed=true;} } if (this.soundplayed==false){ gotoAndPlay(3); } else {} //Key Up Event stage.addEventListener(KeyboardEvent.KEY_UP, keyup); function keyup(event:KeyboardEvent){ if (event.keyCode==Qkey) { this.soundplayed=true; gotoAndStop(1); } } doing this just turns off the key event I think i need to add a "&& keyDown..." to "if (this.soundplayed==true)" but i dont know how to do it without getting errors here is the keyboard player i'm trying to fix http://soulseekrecords.org/psysci/animation/piano.html
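
    The true/false guard is the right idea; it just needs to be set on key down, cleared on key up, and checked inside the key-down handler rather than outside it (the posted version also sets soundplayed to true again on key up, so it never resets). A sketch keeping the question's frame numbers:

        // Sketch: react only when the key was not already down, so auto-repeat is ignored.
        var qKey:uint = 81;
        var qIsDown:Boolean = false;

        stage.addEventListener(KeyboardEvent.KEY_DOWN, keydown);
        stage.addEventListener(KeyboardEvent.KEY_UP, keyup);

        function keydown(e:KeyboardEvent):void {
            if (e.keyCode == qKey && !qIsDown) {
                qIsDown = true;
                gotoAndPlay(3);          // play the note once per physical key press
            }
        }

        function keyup(e:KeyboardEvent):void {
            if (e.keyCode == qKey) {
                qIsDown = false;
                gotoAndStop(1);
            }
        }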

    Read the article

  • Distributed and/or Parallel SSIS processing

    - by Jeff
    Background: Our company hosts SaaS DSS applications, where clients provide us data Daily and/or Weekly, which we process & merge into their existing database. During business hours, load in the servers are pretty minimal as it's mostly users running simple pre-defined queries via the website, or running drill-through reports that mostly hit the SSAS OLAP cube. I manage the IT Operations Team, and so far this has presented an interesting "scaling" issue for us. For our daily-refreshed clients, the server is only "busy" for about 4-6 hrs at night. For our weekly-refresh clients, the server is only "busy" for maybe 8-10 hrs per week! We've done our best to use some simple methods of distributing the load by spreading the daily clients evenly among the servers such that we're not trying to process daily clients back-to-back over night. But long-term this scaling strategy creates two notable issues. First, it's going to consume a pretty immense amount of hardware that sits idle for large periods of time. Second, it takes significant Production Support over-head to basically "schedule" the ETL such that they don't over-lap, and move clients/schedules around if they out-grow the resources on a particular server or allocated time-slot. As the title would imply, one option we've tried is running multiple SSIS packages in parallel, but in most cases this has yielded VERY inconsistent results. The most common failures are DTExec, SQL, and SSAS fighting for physical memory and throwing out-of-memory errors, and ETLs running 3,4,5x longer than expected. So from my practical experience thus far, it seems like running multiple ETL packages on the same hardware isn't a good idea, but I can't be the first person that doesn't want to scale multiple ETLs around manual scheduling, and sequential processing. One option we've considered is virtualizing the servers, which obviously doesn't give you any additional resources, but moves the resource contention onto the hypervisor, which (from my experience) seems to manage simultaneous CPU/RAM/Disk I/O a little more gracefully than letting DTExec, SQL, and SSAS battle it out within Windows. Question to the forum: So my question to the forum is, are we missing something obvious here? Are there tools out there that can help manage running multiple SSIS packages on the same hardware? Would it be more "efficient" in terms of parallel execution if instead of running DTExec, SQL, and SSAS same machine (with every machine running that configuration), we run in pairs of three machines with SSIS running on one machine, SQL on another, and SSAS on a third? Obviously that would only make sense if we could process more than the three ETL we were able to process on the machine independently. Another option we've considered is completely re-architecting our SSIS package to have one "master" package for all clients that attempts to intelligently chose a server based off how "busy" it already is in terms of CPU/Memory/Disk utilization, but that would be a herculean effort, and seems like we're trying to reinvent something that you would think someone would sell (although I haven't had any luck finding it). So in summary, are we missing an obvious solution for this, and does anyone know if any tools (for free or for purchase, doesn't matter) that facilitate running multiple SSIS ETL packages in parallel and on multiple servers? (What I would call a "queue & node based" system, but that's not an official term). 
    Ultimately VMWare's Distributed Resource Scheduler addresses this as you simply run a consistent number of clients per VM that you know will never conflict scheduling-wise, then leave it up to VMWare to move the VMs around to balance out hardware usage. I'm definitely not against using VMWare to do this, but since we're a 100% Microsoft app stack, it seems like -someone- out there would have solved this problem at the application layer instead of the hypervisor layer by checking on resource utilization at the OS, SQL, SSAS levels. I'm open to ANY discussion on this, and remember no suggestion is too crazy or radical! :-) Right now, VMWare is the only option we've found to get away from "manually" balancing our resources, so any suggestions that leave us on a pure Microsoft stack would be great. Thanks guys, Jeff

    Read the article

  • Php JSON Response Array

    - by Nick Kl
    I have this php code. As you can see i query a mysql database through a function showallevents. I return a the $result to the $event variable. I try to return all rows of the data that i take with the msql_fetch_assoc. I don't get response even when i encode the $response variable. It returns null to all fields. Can anyone help me on what i am doing wrong. I had a valid code but it was returning only 1 row of data so i tried to make an associative array but seems i am failing. if ($tag == 'showallevents') { // Request type is show all events // show all events $event = $db->showallevents(); if ($event != false) { $data = array(); while($row = mysql_fetch_assoc($event)) { $data[] = array( $response["success"] = 1, $response["uid"] = $event["uid"], $response["event"]["date"] = $event["date"], $response["event"]["hours"] = $event["hours"], $response["event"]["store_name"] = $event["store_name"], $response["event"]["event_information"] = $event["event_information"], $response["event"]["event_type"] = $event["event_type"], $response["event"]["Phone"] = $event["Phone"], $response["event"]["address"] = $event["address"], $response["event"]["created_at"] = $event["created_at"], $response["event"]["updated_at"] = $event["updated_at"]); } echo json_encode($data); } else { // event not found // echo json with error = 1 $response["error"] = 1; $response["error_msg"] = "Events not found"; echo json_encode($response); } } else { echo "Access Denied"; } } ?>
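
    The loop reads its fields from $event (the MySQL result resource) instead of from $row, and mixes assignment statements into the array() literal, which is why every field comes out null. A sketch of the loop using the question's own field names:

        // Sketch: build one associative entry per row, reading from $row, then encode once.
        $data = array();
        while ($row = mysql_fetch_assoc($event)) {
            $data[] = array(
                'success' => 1,
                'uid'     => $row['uid'],
                'event'   => array(
                    'date'              => $row['date'],
                    'hours'             => $row['hours'],
                    'store_name'        => $row['store_name'],
                    'event_information' => $row['event_information'],
                    'event_type'        => $row['event_type'],
                    'Phone'             => $row['Phone'],
                    'address'           => $row['address'],
                    'created_at'        => $row['created_at'],
                    'updated_at'        => $row['updated_at'],
                ),
            );
        }
        echo json_encode($data);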

    Read the article
