Search Results

Search found 2905 results on 117 pages for 'ad hoc'.


  • How to move MOSS 2007 to another SharePoint Farm

    - by DipeshBhanani
    It was the time of my first onsite client assignment on SharePoint. The client had a single-server production environment and wanted to upgrade the topology to a completely new SharePoint farm of three servers. So the task was to move the whole MOSS 2007 installation to the new server environment without impacting data. The last three words, "without impacting data", were what really put pressure on my head. Moreover, the SSP had to be moved as well, because additional information had been added for users beyond the AD import.

    I thought I only had to do a backup and restore; it looked pretty easy at first. Just because of those three scary words, I checked the internet for guidance on this scenario, but I couldn't find anything except the general guidance on moving servers on the Microsoft TechNet site. I promised myself I would start blogging with this post if I was successful in this task. Well, it took me a long time to write this, but I finally made it. I hope it will be useful to everyone looking at a SharePoint server move.

    Before beginning the restoration, make sure there is no difference between the SharePoint versions on the source and destination servers. Also check whether the state of the SharePoint installation at the time of backup and restore is the same (e.g. SharePoint-related service packs and patches, if any).

    The main tasks of the server movement are as follows:

    1. Back up all the databases.
    2. Install and configure SharePoint on the new environment.
    3. Deploy all solutions (WSP files) globally to the destination server, to install the features attached to the solutions.
    4. Install all the custom features.
    5. Deploy/copy any custom pages/files that were later added to the "12Hive" folder.
    6. Restore the SSP.
    7. Restore My Site.
    8. Restore the other web applications.

    Tasks 3 to 5 are there to make sure the environment is configured well enough for the web applications to be restored successfully. The main and most complex task was restoring the SSP. I started restoring the SSP through Central Admin. After a while, the restoration status was updated to "unsuccessful". "Damn it, what went wrong?" I thought, looking at the error detail at the bottom of the page. I can't remember the exact error message, but I corrected it and restored again.

    Actually, once an SSP restore fails, the restoration will keep failing again and again until you clean up all the related pieces properly. I wanted to find the actual reason, so I cleaned, restored, cleaned, restored... I tried almost 5-6 times and finally succeeded. I realized how pleasant it is to see the word "Successful" on the screen. Without wasting more of your time, here are the detailed steps for restoring the SSP:

    1. Delete the SSP with the following STSADM command:
       stsadm -o deletessp -title <SSP name> -deletedatabases -force
       e.g.: stsadm -o deletessp -title SharedServices1 -deletedatabases -force
    2. Check for and delete the web application associated with the SSP, if it exists.
    3. Check for and remove the "Alternate Access Mapping" associated with the SSP, if it exists.
    4. Check for and delete the IIS site as well as the application pool associated with the SSP, if they exist.
    5. Stop the following services (a scripted way to stop and restart these is sketched at the end of this post):
       - Office SharePoint Server Search
       - Windows SharePoint Services Search
       - Windows SharePoint Services Help Search
    6. Delete all the databases associated with or related to the SSP from SQL Server.
    7. Reset IIS.
    8. Start the following services again:
       - Office SharePoint Server Search
       - Windows SharePoint Services Search
       - Windows SharePoint Services Help Search
    9. Restore the new SSP.

    After the SSP restoration, everything else completed very smoothly without any more issues. I made a few modifications to the sites for the server name change and, finally, the new environment was ready.
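    Steps 5 and 8 can be scripted instead of clicked through in the Services console. The snippet below is only a rough sketch of my own, not part of the original checklist: it assumes the three search services are installed under the display names listed above, that System.ServiceProcess.dll is referenced, and that it runs elevated on the SharePoint box.

        using System;
        using System.ServiceProcess; // add a reference to System.ServiceProcess.dll

        class SearchServiceToggle
        {
            // Display names as they appear in the Services console (taken from the steps above).
            static readonly string[] DisplayNames =
            {
                "Office SharePoint Server Search",
                "Windows SharePoint Services Search",
                "Windows SharePoint Services Help Search"
            };

            static void Main(string[] args)
            {
                bool start = args.Length > 0 &&
                             args[0].Equals("start", StringComparison.OrdinalIgnoreCase);

                foreach (ServiceController sc in ServiceController.GetServices())
                {
                    if (Array.IndexOf(DisplayNames, sc.DisplayName) < 0)
                        continue; // not one of the search services

                    if (start && sc.Status != ServiceControllerStatus.Running)
                    {
                        sc.Start();
                        sc.WaitForStatus(ServiceControllerStatus.Running, TimeSpan.FromMinutes(2));
                    }
                    else if (!start && sc.Status != ServiceControllerStatus.Stopped)
                    {
                        sc.Stop();
                        sc.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromMinutes(2));
                    }

                    sc.Refresh();
                    Console.WriteLine("{0}: {1}", sc.DisplayName, sc.Status);
                }
            }
        }

    Run it with no arguments to stop the services before deleting the SSP databases, and run it again with a "start" argument after the IIS reset.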

    Read the article

  • How Mature is Your Database Change Management Process?

    - by Ben Rees
    How do you get your database schema changes live, onto your production system? As your team of developers and DBAs works on the changes to the database that support your business-critical applications, how do these updates wend their way from dev environments, possibly to QA, hopefully through pre-production and eventually to production in a controlled, reliable and repeatable way? In this article, I describe a model we use to try and understand the different stages that customers go through as their database change management processes mature, from the very basic and manual through to advanced continuous delivery practices. I also provide a simple chart that will help you determine "How mature is our database change management process?"

    This process of managing changes to the database – which all of us who have worked in application/database development have had to deal with in one form or another – is sometimes known as Database Change Management (even if we've never used the term ourselves). And it's a difficult process, often painfully so. Some developers take the approach of "I've no idea how my changes get live – I just write the stored procedures and add columns to the tables. It's someone else's problem to get this stuff live. I think we've got a DBA somewhere who deals with it – I don't know, I've never met him/her". I know I used to work that way. I worked that way because I assumed that making the updates to production was a trivial task – how hard can it be? Pause the application for half an hour in the middle of the night, copy over the changes to the app and the database, and switch it back on again? Voila!

    But somehow it never seemed that easy. And it certainly was never that easy for database changes. Why? Because you can't just overwrite the old database with the new version. Databases have state – more specifically, 4TB of critical data built up over the last 12 years of running your business – and if your quick hotfix happened to accidentally delete that 4TB of data, then you're "looking for a new role" pretty quickly after the failed release.

    There are a lot of other reasons why a managed database change management process is important for organisations, besides job security, not least:

    - Frequency of releases. Many business managers are feeling the pressure to get functionality out to their users sooner, quicker and more reliably. The new book (which I highly recommend) Lean Enterprise by Jez Humble, Barry O'Reilly and Joanne Molesky provides a great discussion on how many enterprises are having to move towards a leaner, more frequent release cycle to maintain their competitive advantage. It's no longer acceptable to release once per year, leaving your customers waiting all year for changes they desperately need (and expect).
    - Auditing and compliance. SOX, HIPAA and other compliance frameworks have demanded that companies implement proper processes for managing changes to their databases, whether managing schema changes, making sure that the data itself is being looked after correctly, or other mechanisms that provide an audit trail of changes.
    We've found at Red Gate that we have a very wide range of customers using every possible form of database change management imaginable. Everything from "Nothing – I just fix the schema on production from my laptop when things go wrong, and write it down in my notebook" to "A full Continuous Delivery process – any change made by a dev gets checked in and recorded, fully tested (including performance tests) before a (tested) release is made available to our Release Management system, ready for live deployment!". And everything in between, of course. Because of the vast number of customers using so many different approaches, we found ourselves struggling to keep on top of what everyone was doing – struggling to identify patterns in customers' behavior. This is useful for us, because we want to try and fit the products we have to different needs – different products are relevant to different customers, and we waste everyone's time (most notably, our customers') if we're suggesting products that aren't appropriate for them. If someone visited a sports store looking to embark on a new fitness program, and the store assistant suggested the latest $10,000 multi-gym, complete with multiple weights mechanisms, dumb-bells, pull-up bars and so on, then he's likely to lose that customer. All he needed was a pair of running shoes!

    To solve this issue – in an attempt to simplify how we understand our customers and our offerings – we built a model. This is an attempt at classifying our customers into some sort of model or "Customer Maturity Framework", as we rather grandly term it, which somehow simplifies our understanding of what our customers are doing. The great statistician George Box (amongst other things, the "Box" in the Box-Jenkins time series model) gave us the famous quote: "Essentially all models are wrong, but some are useful." We've taken this quote to heart – we know it's a gross over-simplification of the real world of how users work with complex legacy and new database developments. Almost nobody precisely fits into one of our categories. But we hope it's useful and interesting.

    There are actually a number of similar models that exist for more general application delivery. We've found these from ThoughtWorks/Forrester, from InfoQ and others, and initially we tried just taking these models and replacing the word "application" with "database". However, we hit a problem. From talking to our customers we know that users are much further behind in mature database change management than they are in application development. As a simple example, no application developer who wants to keep his/her job would develop an application for an organisation without source controlling that code. Sure, he/she might not be using an advanced Gitflow branching methodology, but they'll certainly be making sure their code gets managed in a repo somewhere, with all the benefits of history, auditing and so on. But this certainly isn't the case (yet) for the database – a very large segment of the people we speak to have no source control set up for their databases whatsoever, even at the most basic level (for example, keeping change scripts in a source control system somewhere). By the way, if this is you, Red Gate has a great whitepaper here on the barriers people face getting a source control process implemented at their organisations.
    This difference in maturity is the same as you move into areas such as continuous integration (common amongst app developers, relatively rare for database developers) and automated release management (growing amongst app developers, very rare for the database). So, when we created the model we started from scratch and biased the levels of maturity towards what we actually see amongst our customers. But what are these stages? And what level are you? The table below describes our definitions for four levels of maturity – Baseline, Beginner, Intermediate and Advanced. As I say, this is a model – you won't fit any of these categories perfectly, but hopefully one will ring true more than others. We've also created a PDF with a flow chart to help you find which of these groups most closely matches your team: Download the Database Delivery Maturity Framework PDF here.

    Level D1 – Baseline
    - Work directly on live databases
    - Sometimes work directly in production
    - Generate manual scripts for releases; sometimes use a product like SQL Compare or similar to do this
    - Any tests that we might have are run manually

    Level D2 – Beginner
    - Have some ad-hoc DB version control, such as manually adding upgrade scripts to a version control system
    - An attempt is made to keep production in sync with development environments
    - There is some documentation and planning of manual deployments
    - Some basic automated DB testing is in place

    Level D3 – Intermediate
    - The database is fully version-controlled with a product like Red Gate SQL Source Control or SSDT
    - Database environments are managed
    - The production environment schema is reproducible from the source control system
    - There are some automated tests
    - Have looked at using migration scripts for difficult database refactoring cases

    Level D4 – Advanced
    - Using continuous integration for database changes
    - Build, testing and deployment of DB changes carried out through a proper database release process
    - Fully automated tests
    - The production system is monitored for fast feedback to developers

    Does this model reflect your team at all? Where are you on this journey? We'd be very interested in knowing how you get on. We're doing a lot of work at the moment at Red Gate, trying to help people progress through these stages. For example, if you're currently not source controlling your database, then this is a natural next step. If you are already source controlling your database, what about the next stage – continuous integration and automated release management? To help understand these issues, there's a summary of the Red Gate Database Delivery learning program on our site, alongside a Patterns and Practices library here on Simple-Talk and a Training Academy section on our documentation site to help you get up and running with the tools you need to progress. All feedback is welcome and it would be great to hear where you find yourself on this journey!

    This article is part of our database delivery patterns & practices series on Simple Talk. Find more articles on version control, automated testing, continuous integration & deployment.

    Read the article

  • The Apple iPad – I'm gonna get it!

    - by Sahil Malik
    Well, heck, here comes another non-techie blogpost. You know I'm a geek, so I love gadgets! I found it RATHER interesting to see all the negative news on the blogosphere about the iPad. The main bitch points are:

    - No multitasking
    - No Flash
    - Just a bigger iPhone

    So here's the deal! My view is, the above three are EXACTLY what I had personally hoped for in the Apple iPad. Before the release, I had gone on the record saying, "If the Apple Tablet is able to run full-fledged iTunes (so I can get rid of iTunes on my desktop; I don't like iTunes on Windows), can browse the net, can read PDFs, and will be under $1000, I'll buy it." Well, so, the released iPad wasn't exactly like my dream tablet. The biggest downer IMO was its inability to run full iTunes. But, really, in retrospect, I like the newly released iPad. And here is why.

    No multitasking and no Flash mean much better battery life. Frankly, I rarely multitask on my laptop/desktop... yeah, I know my OS does... but ME, I don't multitask, and I don't think you do either!! As I type this blogpost, I have a few windows running behind the scenes, but they are simply waiting for me to get back to them. The only thing truly running that I am making use of, other than this blogpost, is the media player playing some music – which the iPad can do. Also, I am logged into IM/email – which, again, the iPad can do via notifications. It does the limited multitasking I need, without chewing down on batteries. Smart thinking, precisely the reason I love the iPhone. I don't want a bulky, battery-consuming machine.

    Lack of Flash? Okay, sure, I can't see Hulu on my iPad. That's some loss. I can see YouTube. Also, per Adobe, I can't see some porn sites, which I don't want to see on my iPad anyway. But Flash is heavy, especially Flash video. My dream is to see Silverlight run on the iPhone and iPad. No Flash = not such a big loss.

    Speaking of battery life – 10 hours is plenty. I haven't been away from electricity for that long usually, so I'm okay with charging it up when it runs low. It's really not such a big deal, honestly. Finally, eBook functionality – wow! I went on the record saying eBook readers are not for me, but seriously, the iPad is perfect for my eBook needs at least.

    And as far as it being just a bigger iPhone? I've always wanted a bigger iPhone, precisely for the eBook reading experience. I love my iPhone, I love the apps on it. The only thing that sucks about the iPhone is battery life, but other than that, it is the best gadget I have ever bought! And something that runs on mobile chips, is that thin, and has those newly written apps... mail, calendar... I am very, very excited to get my iPad, which will be the 64-gig 3G version. The biggest plus in an iPad: no contract on data. I am *hoping* this means that I can buy a SIM card in Europe, and use the iPad here. That would be killer awesome!

    But hey, if I had to pick downers in the iPad, they would be:

    - I wish they had a 128G version. Now that we have a good video viewing machine, I know I'd chew up space quickly.
    - Sync over WiFi, seriously Apple. Both for iPhone and iPad.
    - The 3-month wait!!
    - Existing iPhone users should get a discount on the iPad data plan.

    Read the article

  • Use your own domain email and tired of SPAM? SPAMfighter FTW

    - by Dave Campbell
    I wouldn't post this if I hadn't tried it... and I paid for it myself, so don't anybody be thinking I'm reviewing something someone sent me!

    Long ago and far away I got very tired of local ISPs and 2nd phone lines and took the plunge and got hooked up to cable... yeah, I know the 2nd phone line concept may be hard for everyone to understand, but that's how it was in 'the old days'. To avoid having to change email addresses all the time, I decided to buy a domain name, get minimal hosting, and use that for all email into the house. That way if I changed providers, all the email addresses wouldn't have to change. Of course, about a dozen domains later, I have LOTS of POP email addresses and even an Exchange address to my client's server... times have changed.

    What also has changed is the fact that we get SPAM... 'back in the day' when I was a beta tester for the first ISP in Phoenix, someone tried sending an ad to all of us, and what he got in return for his trouble was a bunch of core dumps that locked up his email... if you don't know what a core dump is, ask your grandfather. But in today's world, we're all much more civilized than that, and as with many things, the criminals seem to have far more rights than we do, so we get inundated with email offering all sorts of wild schemes that you'd have to be brain-dead to accept, but yet... if people weren't accepting them, they'd stop sending them. I keep hoping that survival of the smartest would weed out the mental midgets that respond, and then the junk email would stop, but that hasn't happened yet, any more than finding high-quality hearing aids at the checkout line of Safeway because of all the dimwits playing music too loud inside their car... but that's another whole topic and I digress.

    So what's the solution for all the spam? And I mean *all*... on that old personal email address, I am now getting over 150 spam messages a day! Yes, I know that's why God invented the delete key, but I took it on as a challenge, and it's a matter of principle... why should I switch email addresses, or convert from [email protected] to something else, or have all my email filtered through some service just because some A-Hole somewhere has a site up trying to phish Ma & Pa Kettle (ask your grandfather about that too) out of their retirement money?

    Well... I got an email from my cousin the other day while I was writing yet another email rule, and there was a banner on the bottom of his email that said he was protected by SPAMfighter. SPAMfighter, huh... so I took a look at their site, and found yet one more of the supposed tools to help us. But... I read that they're a Microsoft Gold Partner... and that doesn't come lightly... so I took a gamble and here's what I found:

    1. I installed it, and had to do a couple things: SPAMfighter stuffed the SPAMfighter folder into my client's Exchange address... I deleted it, made a new SPAMfighter folder where I wanted it to go, then in the SPAMfighter client's settings for Outlook, I told it to put all spam there.
    2. It didn't seem to be doing anything. There's a ribbon button that you can select "Block", and I did that, wondering if I was 'training' it, but it wasn't picking up duplicates.
    3. I sent email to support, and wrote a post on the forum (note to self: reply to that post). By the time the folks from the home office responded, it was the next day, and first up, SPAMfighter knocked down everything that came through when Outlook opened... two thumbs up! I disabled my 'garbage collection' rule from Outlook, and told Outlook not to use the junk folder, thinking it was interfering.
    4. Day 2 seemed to go about like Day 1... but I hung in there.
    5. Day 3 is now a whole new day... I had left Outlook open and hadn't looked at the PC since sometime late yesterday afternoon, and when I looked this morning, *every bit* of spam was in the SPAMfighter folder!!

    I'm a new paying customer. After watching SPAMfighter work this morning, I've purchased a 1-year license, and I now can sit and watch as emails come in and disappear from my inbox into the SPAMfighter folder. No more continual tweaking of the rules. I've got SPAMfighter set to 'Very Hard' filtering... personally I'd rather pull the few real emails out of the SPAMfighter folder than pull spam out of the real folders. Yes, this is simply another way of using the delete key, but you know what? ... it feels good :)

    Here's a screenshot of the stats after just about 48 hours of being on board: Note that all the ones blocked by me were during Day 1 and 2... I've blocked none today, and everything is blocked. Stay in the 'Light!

    Read the article

  • CodePlex Daily Summary for Wednesday, June 26, 2013

    Popular Releases

    - Naked Objects: Naked Objects Release 5.5.0: This release includes a number of significant improvements to the usability of the UI, some of which involve new programming conventions or attributes: Action dialogs now appear as pop-up modal dialogs instead of as a new page; query-only actions have an Apply as well as an OK button. See https://nakedobjects.codeplex.com/workitem/175 When a reference object is expanded in-line there is a button to jump straight to an Edit view of that object see https://nakedobjects.codeplex.com/workitem/1...
    - VeraCrypt: VeraCrypt version 1.0b: Changes since version 1.0a: Enhance RIPEMD160 implementation in BootLoaded by using the compiler uint32 type. Don't position legacy flag in volume header for newer VeraCrypt releases.
    - Player Framework by Microsoft: Player Framework for Windows 8 and WP8 (v1.3 beta): Preview: New MPEG DASH adaptive streaming plugin for WAMS. Preview: New Ultraviolet CFF plugin. Preview: New WP7 version with WP8 compatibility (source code only). Source code is now available via CodePlex Git. Misc bug fixes and improvements: WP8 only: Added optional fullscreen and mute buttons to default xaml. JS only: protecting currentTime from returning infinity. Some videos would cause currentTime to be infinity which could cause errors in plugins expecting only finite values. (...
    - SSIS DQS Matching Transformation: SSIS DQS Matching Transformation 1.0: Initial release of the SSIS DQS Matching Component.
    - AssaultCube Reloaded: 2.5.8: SERVER OWNERS: note that the default maprot has changed once again. Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we continue to try to package for those OSes. Or better yet, try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compi...
    - Compare .NET Objects: Version 1.7.2.0: If you like it, please rate it. :) Performance improvements. Fix for deleted row in a data table. Added ability to ignore the collection order. Fix for ignoring by attributes.
    - Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.95: Update parser to allow for CSS3 calc( function to nest. Add recognition of -pponly (Preprocess-Only) switch in AjaxMinManifestTask build task. Fix crashing bug in EXE when processing a manifest file using the -xml switch and an error message needs to be displayed (like a missing input file). Create separate Clean and Bundle build tasks for working with manifest files (AjaxMinManifestCleanTask and AjaxMinBundleTask). Removed the IsCleanOperation from AjaxMinManifestTask -- use AjaxMinMan...
    - VG-Ripper & PG-Ripper: VG-Ripper 2.9.44: Changes: NEW: Added support for "ImgChili.net" links. FIXED: Auto Updater.
    - Document.Editor: 2013.25: What's new for Document.Editor 2013.25: Improved Spell Check support. Improved User Interface. Minor bug fixes, improvements and speed ups.
    - StyleMVVM: 3.0.2: This is a minor feature and bug fix release. Features: ExportWhenDebuggerIsAttacedAttribute - new attribute that marks an attribute to only be exported when the debugger is attached. InjectedFilterAttributeFilterProvider - new attribute filter provider for MVC that injects the attributes. Performance improvements - minor speed improvements all over, and Import collections is now 50% faster. Bug fixes: Open generic constraints are now respected when finding exports. Fix for fluent registrat...
    - WPF Composites: Version 4.3.0: In this Beta release, I broke my code out into two separate projects. There is a core FasterWPF.dll with the minimal required functionality. This can run with only the Aero.dll and the Rx .dll's. Then, I have a FasterWPFExtras .dll that requires and supports the Extended WPF Toolkit™ Community Edition V 1.9.0 (including Xceed DataGrid) and the Thriple .dll. This is for developers who want more . . . Finally, you may notice the other OPTIONAL .dll's available in the download such as the Dyn...
    - Channel9's Absolute Beginner Series: Windows Phone 8: Entire source code for the Channel 9 series, Windows Phone 8 Development for Absolute Beginners.
    - Indent Guides for Visual Studio: Indent Guides v13: Important: This release does not support Visual Studio 2010. The latest stable release for VS 2010 is v12.1. Version history, changed in v13: Added page width guide lines. Added guide highlighting options. Fixed guides appearing over collapsed blocks. Fixed guides not appearing in newly opened files. Fixed some potential crashes. Fixed lines going through pragma statements. Various updates for VS 2012 and VS 2013. Removed VS 2010 support. Changed in v12.1: Fixed crash when unable to start...
    - Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 2.1.0 - Prerelease d (supports .NET 3.5, 4.0 and 4.5). Includes: Fluent.dll (with .pdb and .xml), Showcase Application, Samples (not for .NET 3.5), Foundation (Tabs, Groups, Contextual Tabs, Quick Access Toolbar, Backstage), Resizing (ribbon reducing & enlarging principles), Galleries (Gallery in ContextMenu, InRibbonGallery), MVVM (shows how to use this library with Model-View-ViewModel pattern), KeyTips, ScreenTips, Toolbars, ColorGallery, *Walkthrough (do...
    - Magick.NET: Magick.NET 6.8.5.1001: Magick.NET compiled against ImageMagick 6.8.5.10. Breaking changes: MagickNET.Initialize has been made obsolete because the ImageMagick files in the directory are no longer necessary. MagickGeometry is no longer IDisposable. Renamed dll's so they include the platform name. Image profiles can now only be accessed and modified with ImageProfile classes. Renamed DrawableBase to Drawable. Removed Args part of PathArc/PathCurvetoArgs/PathQuadraticCurvetoArgs classes. The...
    - Keyboard Image Viewer: 1.5.4: Upgraded folder picker dialog to better version on Win7+. Fixed bug that stopped slideshow from looping back to the start of the list. Added crash dialog that allows you to see and copy exception stack traces for fixing.
    - DependencyAnalysis (Egg and Gherkin): 0.9.4: Create Visual Studio 2012 Addin for ad-hoc analysis of your project. Display metrics in a grid. Adequate performing serialization between Addin (Visual Studio process) and AnalysisHost process. Display dependencies as graph (proximity graph). Create a logo for the project. Constructors of anonymous types no longer hide constructors of the declaring type during "build dependencies" phase. Type descriptors were added multiple times to SubmoduleDescriptor.Types collection, same instanc...
    - Bloomberg API Emulator: Bloomberg API Emulator (v 1.0.5): This version contains the full Java port of my original C# code. I just finished the MarketDataSubscription request type. I will start working on a C++ port of my C# code.
    - Three-Dimensional Maneuver Gear for Minecraft: TDMG 1.1.0.0 for 1.5.2
    - Hyper-V Management Pack Extensions 2012: HyperVMPE2012 (v1.0.1.126): Hyper-V Management Pack Extensions 2012 Beta Release

    New Projects

    - .Net Encryption App: This is a C#.Net desktop application that will let users encrypt and de-crypt files with the algorithm of their choosing.
    - AutoSPDocumenter: AutoSPDocumenter utilises PowerShell to document SharePoint farms and provide output in usable formats.
    - Azure Storage Redirector: Redirects requests to Global Azure Storage to China Azure Storage.
    - brownbag: Simple project to show branching, merging, and shelving.
    - Channel9's Absolute Beginner Series: Channel 9's absolute beginner series source code. From Windows Phone 7, Windows Phone 8, Windows Store applications, one stop area for all the series.
    - FAST for Sharepoint 2010 Query Tool (.NET 3.5): .NET 3.5 version of the FAST Search for SharePoint MOSS 2010 Query Tool (https://fastforsharepoint.codeplex.com/). For environments without .NET 4.0.
    - HAOest Framework
    - ListManager: by HADB of HAOest
    - MyPS: myps
    - NNRel: NNRel
    - O - BV - 2: Test
    - Open XML SDK for JavaScript: Small JavaScript library that enables you to implement Open XML functionality anywhere you can use JavaScript.
    - Orchard Prefix free: Provides a script manifest for the Prefix free script libraries.
    - PVDesktop: PVDesktop is an application for designing and analyzing specific solar energy sites.
    - Red the sound TowePlay: School project at ISEN Lille. Creation of a collaborative music creation software in C# using .NET.
    - Science Kits for Kids: This project is the service for Childhood Education which aged 4 - 8.
    - SharePoint ULS for PowerShell: Allows PowerShell to log to the SharePoint ULS.
    - SSIS DQS Matching Transformation: The SSIS DQS Matching Transformation uses Data Quality Services (DQS) to find duplicate data within the SSIS data flow.
    - TARVOS Computer Networks Simulator: Discrete event-based network simulator, supports simulating MPLS architecture, several RSVP-TE protocol functionalities and fast recovery.
    - Web API Explorer 4 DNN: Web API Explorer for DNN(R) aids module development allowing you to examine the Routing Table entries for a DotNetNuke(R) portal.

    Read the article

  • Eloqua Experience 2013: Mystique, Modern Marketing and Masterful Engagement

    - by Mike Stiles
    The following is a guest post from Erick Mott, a social business leader at Oracle Eloqua.

    There's a growing gap between 20th century marketing and a modern marketing way of doing business. I can't think of a better example of modern marketing in action than what more than 2,000 people experienced in San Francisco at #EE13; customer obsession, multichannel content, and real-time engagement all coming together at one extraordinary event.

    This was my first Eloqua Experience as a new Oracle Eloqua employee. In the weeks prior, I heard about the mystique but didn't know what to expect. What I've come to understand with more clarity is that everything we do revolves around customer success, and we operate and educate at all times with these five tenets in mind:

    1. Targeting: Really Know Your Buyer
    2. Engagement: Create a 1:1 Relationship
    3. Conversion: Visualize Guided Thinking
    4. Analysis: Learn What's Working
    5. Marketing Technology: Enable and Extend the Cloud

    Product News from Eloqua Experience 2013

    We made some announcements that John Stetic, VP of Products, Oracle Eloqua, covers in this brief 'Modern Marketing Minute' video recorded after Wednesday's keynote; they are summarized below, too:

    - Oracle Eloqua AdFocus: While understanding the impact of a specific marketing channel was formerly relegated to marketers' wish lists, the channels we now focus on are digital, social, and mobile. AdFocus gives marketers a single platform to dynamically create, manage and measure display ads alongside owned and earned media. AdFocus enables marketers to target only the key accounts or prospects you want to reach with display ads, as well as provide creative content or personalized ad copy based on their persona and activities.
    - Oracle Eloqua Profiler: The details of what we now know about customers have expanded into a universal customer profile, which can be used to create highly targeted segments. Marketers can now take data that's not even stored in Eloqua to help target and score prospects for a complete, multichannel view of the customer. Profiler gives sales reps one detailed view of the prospect, extending views beyond Oracle Eloqua asset activity (emails, forms, page views) to any external assets stored in Oracle Eloqua.
    - Marketing Resource Management: New capabilities create more secure and controlled access to marketing resources and data. New integrations provide greater insight into campaign resources and management through a central marketing calendar and simplify resource management.
    - Integrated Sales and Marketing Funnel: An integrated sales and marketing funnel view gives marketing and sales users, cross-functional teams, and executive management a consistent and clear view of pipeline performance. It also quickly provides users with historical metrics across different time spans and conditions.
    - Eloqua AppCloud: More than 20 new AppCloud partners have been added to the community, which now includes 100+ apps. Eloqua AppCloud now provides modern marketers with an even broader range of marketing applications that help expand and enrich sales and marketing efforts; easily accessible in the Topliners Community.
    - Social Capabilities: Recent integration between Oracle Eloqua and Oracle Social Relationship Management (SRM) delivers a comprehensive, scalable and integrated modern marketing solution. New capabilities include better tracking of social activities for a more complete customer profile. Engage Facebook custom audiences with AdFocus to deliver ads and meaningful experiences through trusted social networks.

    Biggest and Best Eloqua Experience

    There's a lot of talk in the industry about the Marketing Cloud. At Oracle Eloqua, we have been on a mission of delivering the most advanced and integrated modern marketing technology on the planet. It's not just a concept but a reality with proven execution, as seen first-hand this week in San Francisco. In this video, Kevin Akeroyd, SVP of Oracle Eloqua, provides some highlights of what made this year's Eloqua Experience exceptional, including Steve Woods' presentation about the journey of modern marketers and Andrea Ward's conversation with Vince Gilligan, creator of the Breaking Bad television series.

    The 2013 Markie Awards

    The Oracle Eloqua Marketing Cloud was best exemplified for me as 19 Markies were awarded to customers for their exceptional creativity and results as modern marketers. Wow, what a night to remember, with so many committed and talented people working to create an extraordinary experience!

    To learn more about how to become a modern marketer, check out these resources. We look forward to seeing you next year at Eloqua Experience.

    More on Erick: 20 years' experience at Oracle, Ektron, Sitecore, Lyris, Habeas, Nokia, creatorbase, Mark Monitor, Cisco Systems, GlobalFluency, Sun Microsystems, Philips NV, Elm Products and CBS TV. Patent holder with agency, Fortune 500, media, and startup company expertise.

    @mikestiles

    Read the article

  • Accessing Oracle DB through SQL Server using OPENROWSET

    - by Ken Paul
    I'm trying to access a large Oracle database through SQL Server using OPENROWSET in client-side Javascript, and not having much luck. Here are the particulars:

    - A SQL Server view that accesses the Oracle database using OPENROWSET works perfectly, so I know I have valid connection string parameters. However, the new requirement is for extremely dynamic Oracle queries that depend on client-side selections, and I haven't been able to get dynamic (or even parameterized) Oracle queries to work from SQL Server views or stored procedures.
    - Client-side access to the SQL Server database works perfectly with dynamic and parameterized queries.
    - I cannot count on clients having any Oracle client software. Therefore, access to the Oracle database has to be through the SQL Server database, using views, stored procedures, or dynamic queries using OPENROWSET.
    - Because the SQL Server database is on a shared server, I'm not allowed to use globally-linked databases.

    My idea was to define a function that would take my own version of a parameterized Oracle query, make the parameter substitutions, wrap the query in an OPENROWSET, and execute it in SQL Server, returning the resulting recordset. Here's sample code:

        // db is a global variable containing an ADODB.Connection opened to the SQL Server DB
        // rs is a global variable containing an ADODB.Recordset
        . . .
        ss = "SELECT myfield FROM mytable WHERE {param0} ORDER BY myfield;";
        OracleQuery(ss, ["somefield='" + somevalue + "'"]);
        . . .
        function OracleQuery(sql, params) {
            var s = sql;
            var i;
            for (i = 0; i < params.length; i++)
                s = s.replace("{param" + i + "}", params[i]);
            var e = "SELECT * FROM OPENROWSET('MSDAORA','(connect-string-values)';" +
                    "'user';'pass','" + s.split("'").join("''") + "') q";
            try {
                rs.Open("EXEC ('" + e.split("'").join("''") + "')", db);
            } catch (eobj) {
                alert("SQL ERROR: " + eobj.description + "\nSQL: " + e);
            }
        }

    The SQL error that I'm getting is:

        Ad hoc access to OLE DB provider 'MSDAORA' has been denied. You must access this provider through a linked server.

    which makes no sense to me. The Microsoft explanation for this error relates to a registry setting (DisallowAdhocAccess). This is set correctly on my PC, but surely this relates to the DB server and not the client PC, and I would expect that the setting there is correct since the view mentioned above works.

    One alternative that I've tried is to eliminate the enclosing EXEC in the Open statement:

        rs.Open(e, db);

    but this generates the same error. I also tried putting the OPENROWSET in a stored procedure. This works perfectly when executed from within SQL Server Management Studio, but fails with the same error message when the stored procedure is called from Javascript.

    Is what I'm trying to do possible? If so, can you recommend how to fix my code? Or is a completely different approach necessary? Any hints or related information will be welcome. Thanks in advance.
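    For what it's worth, here is a rough server-side C# (ADO.NET) sketch of the same idea, not taken from the question: it does the same placeholder substitution and single-quote doubling before wrapping the Oracle text in OPENROWSET. The connection string, provider name, user and password are placeholders, the concatenation is just as open to SQL injection as the original, and the server still has to permit ad hoc distributed queries (or, as the error message suggests, expose the Oracle source as a linked server).

        using System;
        using System.Data;
        using System.Data.SqlClient;

        class OracleViaOpenRowset
        {
            // Substitute {param0}, {param1}, ... into the Oracle query template.
            static string BuildOracleSql(string template, params string[] values)
            {
                string s = template;
                for (int i = 0; i < values.Length; i++)
                    s = s.Replace("{param" + i + "}", values[i]);
                return s;
            }

            static DataTable Run(string sqlServerConnectionString, string oracleSql)
            {
                // Mirrors the JavaScript version: double the quotes so the Oracle text
                // survives being embedded in the T-SQL string literal.
                string wrapped =
                    "SELECT * FROM OPENROWSET('MSDAORA', '(connect-string-values)';'user';'pass', '" +
                    oracleSql.Replace("'", "''") + "') q";

                using (SqlConnection cn = new SqlConnection(sqlServerConnectionString))
                using (SqlCommand cmd = new SqlCommand(wrapped, cn))
                {
                    DataTable result = new DataTable();
                    cn.Open();
                    new SqlDataAdapter(cmd).Fill(result);
                    return result;
                }
            }

            static void Main()
            {
                string oracleSql = BuildOracleSql(
                    "SELECT myfield FROM mytable WHERE {param0} ORDER BY myfield",
                    "somefield='somevalue'");

                DataTable t = Run("Data Source=.;Integrated Security=true", oracleSql);
                Console.WriteLine("{0} row(s) returned", t.Rows.Count);
            }
        }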

    Read the article

  • Grafting LINQ onto C# 2 library

    - by P Daddy
    I'm writing a data access layer. It will have C# 2 and C# 3 clients, so I'm compiling against the 2.0 framework. Although I encourage the use of stored procedures, I'm still trying to provide a fairly complete ability to perform ad-hoc queries. I have this working fairly well already.

    For the convenience of C# 3 clients, I'm trying to provide as much compatibility with LINQ query syntax as I can. Jon Skeet noticed that LINQ query expressions are duck typed, so I don't have to have an IQueryable and IQueryProvider (or IEnumerable<T>) to use them. I just have to provide methods with the correct signatures. So I got Select, Where, OrderBy, OrderByDescending, ThenBy, and ThenByDescending working. Where I need help is with Join and GroupJoin. I've got them working, but only for one join. A brief compilable example of what I have is this:

        // .NET 2.0 doesn't define the Func<...> delegates, so let's define some workalikes
        delegate TResult FakeFunc<T, TResult>(T arg);
        delegate TResult FakeFunc<T1, T2, TResult>(T1 arg1, T2 arg2);

        abstract class Projection{
            public static Condition operator==(Projection a, Projection b){
                return new EqualsCondition(a, b);
            }
            public static Condition operator!=(Projection a, Projection b){
                throw new NotImplementedException();
            }
        }

        class ColumnProjection : Projection{
            readonly Table table;
            readonly string columnName;

            public ColumnProjection(Table table, string columnName){
                this.table = table;
                this.columnName = columnName;
            }
        }

        abstract class Condition{}

        class EqualsCondition : Condition{
            readonly Projection a;
            readonly Projection b;

            public EqualsCondition(Projection a, Projection b){
                this.a = a;
                this.b = b;
            }
        }

        class TableView{
            readonly Table table;
            readonly Projection[] projections;

            public TableView(Table table, Projection[] projections){
                this.table = table;
                this.projections = projections;
            }
        }

        class Table{
            public Projection this[string columnName]{
                get{return new ColumnProjection(this, columnName);}
            }

            public TableView Select(params Projection[] projections){
                return new TableView(this, projections);
            }

            public TableView Select(FakeFunc<Table, Projection[]> projections){
                return new TableView(this, projections(this));
            }

            public Table Join(Table other, Condition condition){
                return new JoinedTable(this, other, condition);
            }

            public TableView Join(Table inner,
                                  FakeFunc<Table, Projection> outerKeySelector,
                                  FakeFunc<Table, Projection> innerKeySelector,
                                  FakeFunc<Table, Table, Projection[]> resultSelector){
                Table join = new JoinedTable(this, inner,
                    new EqualsCondition(outerKeySelector(this), innerKeySelector(inner)));
                return join.Select(resultSelector(this, inner));
            }
        }

        class JoinedTable : Table{
            readonly Table left;
            readonly Table right;
            readonly Condition condition;

            public JoinedTable(Table left, Table right, Condition condition){
                this.left = left;
                this.right = right;
                this.condition = condition;
            }
        }

    This allows me to use a fairly decent syntax in C# 2:

        Table table1 = new Table();
        Table table2 = new Table();

        TableView result = table1
            .Join(table2, table1["ID"] == table2["ID"])
            .Select(table1["ID"], table2["Description"]);

    But an even nicer syntax in C# 3:

        TableView result = from t1 in table1
                           join t2 in table2 on t1["ID"] equals t2["ID"]
                           select new[]{t1["ID"], t2["Description"]};

    This works well and gives me identical results to the first case. The problem is if I want to join in a third table:

        TableView result = from t1 in table1
                           join t2 in table2 on t1["ID"] equals t2["ID"]
                           join t3 in table3 on t1["ID"] equals t3["ID"]
                           select new[]{t1["ID"], t2["Description"], t3["Foo"]};

    Now I get an error (Cannot implicitly convert type 'AnonymousType#1' to 'Projection[]'), presumably because the second join is trying to join the third table to an anonymous type containing the first two tables. This anonymous type, of course, doesn't have a Join method. Any hints on how I can do this?
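    One direction that might help (this is my own self-contained sketch against trimmed stand-in types, not the library above): make Join generic in its result selector and have it return a small generic wrapper, so the anonymous pair the compiler generates for the first join (its transparent identifier) has a type that itself exposes Join and Select. The Table and Projection classes below are reduced to the minimum needed to show the shape; a real version would carry the accumulated JoinedTable and conditions instead of just printing them.

        using System;

        // .NET 2.0-friendly delegate workalikes, as in the question.
        delegate TResult FakeFunc<T, TResult>(T arg);
        delegate TResult FakeFunc<T1, T2, TResult>(T1 arg1, T2 arg2);

        class Projection
        {
            public readonly string Name;
            public Projection(string name){ Name = name; }
        }

        class Table
        {
            public readonly string Name;
            public Table(string name){ Name = name; }

            public Projection this[string column]
            {
                get { return new Projection(Name + "." + column); }
            }

            // Generic result selector: TShape becomes the compiler-generated anonymous pair type.
            public JoinedSource<TShape> Join<TShape>(Table inner,
                FakeFunc<Table, Projection> outerKey,
                FakeFunc<Table, Projection> innerKey,
                FakeFunc<Table, Table, TShape> resultSelector)
            {
                // A real implementation would build a JoinedTable + EqualsCondition here.
                Console.WriteLine("join {0} on {1} = {2}", inner.Name, outerKey(this).Name, innerKey(inner).Name);
                return new JoinedSource<TShape>(resultSelector(this, inner));
            }
        }

        // Wraps the "shape" that query syntax threads from one clause to the next.
        class JoinedSource<TShape>
        {
            readonly TShape shape;
            public JoinedSource(TShape shape){ this.shape = shape; }
            public TShape Result { get { return shape; } }

            public JoinedSource<TResult> Join<TResult>(Table inner,
                FakeFunc<TShape, Projection> outerKey,
                FakeFunc<Table, Projection> innerKey,
                FakeFunc<TShape, Table, TResult> resultSelector)
            {
                Console.WriteLine("join {0} on {1} = {2}", inner.Name, outerKey(shape).Name, innerKey(inner).Name);
                return new JoinedSource<TResult>(resultSelector(shape, inner));
            }

            // Needed when the final select is not folded into the last join.
            public JoinedSource<TResult> Select<TResult>(FakeFunc<TShape, TResult> selector)
            {
                return new JoinedSource<TResult>(selector(shape));
            }
        }

        class Demo
        {
            static void Main()
            {
                Table table1 = new Table("t1"), table2 = new Table("t2"), table3 = new Table("t3");

                // Both joins now desugar onto the generic Join overloads above,
                // and the final select is folded into the second Join call.
                JoinedSource<Projection[]> result =
                    from t1 in table1
                    join t2 in table2 on t1["ID"] equals t2["ID"]
                    join t3 in table3 on t1["ID"] equals t3["ID"]
                    select new[] { t1["ID"], t2["Description"], t3["Foo"] };

                foreach (Projection p in result.Result)
                    Console.WriteLine(p.Name);
            }
        }

    In the real library, TableView could remain the end product by giving the wrapper a way to materialize a Projection[] result back into a TableView.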

    Read the article

  • Adding a new Entity gives me the error: EntityCommandCompilationException was unhandled by user code

    - by programmerist
    I have 5 tables in the project I started. I added a new table (Urun entity) and wrote the code below in project.BAL:

        public static List<Urun> GetUrun()
        {
            using (GenoTipSatisEntities genSatisUrunCtx = new GenoTipSatisEntities())
            {
                ObjectQuery<Urun> urun = genSatisUrunCtx.Urun;
                return urun.ToList();
            }
        }

    I then consume the data from the BAL in the UI (.aspx code-behind):

        using project.BAL;

        namespace GenoTip.Web.ContentPages.Satis
        {
            public partial class SatisUrun : System.Web.UI.Page
            {
                protected void Page_Load(object sender, EventArgs e)
                {
                    if (!IsPostBack)
                    {
                        FillUrun();
                    }
                }

                void FillUrun()
                {
                    ddlUrun.DataSource = SatisServices.GetUrun();
                    ddlUrun.DataValueField = "ID";
                    ddlUrun.DataTextField = "Ad";
                    ddlUrun.DataBind();
                }
            }
        }

    I added Urun later, and the EntityCommandCompilationException is thrown at the ToList method. Error detail:

        Error 1 Error 3007: Problem in Mapping Fragments starting at lines 659, 873: Non-Primary-Key column(s) [UrunID] are being mapped in both fragments to different conceptual side properties - data inconsistency is possible because the corresponding conceptual side properties can be independently modified. C:\Users\pc\Desktop\GenoTip.Satis\GenoTip.DAL\ModelSatis.edmx 660 15 GenoTip.DAL
        Error 2 Error 3012: Problem in Mapping Fragments starting at lines 659, 873: Data loss is possible in FaturaDetay.UrunID. An Entity with Key (PK) will not round-trip when: (PK does NOT play Role 'FaturaDetay' in AssociationSet 'FK_FaturaDetay_Urun' AND PK is in 'FaturaDetay' EntitySet) C:\Users\pc\Desktop\GenoTip.Satis\GenoTip.DAL\ModelSatis.edmx 874 11 GenoTip.DAL
        Error 3 Error 3012: Problem in Mapping Fragments starting at lines 659, 873: Data loss is possible in FaturaDetay.UrunID. An Entity with Key (PK) will not round-trip when: (PK is in 'FaturaDetay' EntitySet AND PK does NOT play Role 'FaturaDetay' in AssociationSet 'FK_FaturaDetay_Urun' AND Entity.UrunID is not NULL) C:\Users\pc\Desktop\GenoTip.Satis\GenoTip.DAL\ModelSatis.edmx 660 15 GenoTip.DAL
        Error 4 Error 3007: Problem in Mapping Fragments starting at lines 748, 879: Non-Primary-Key column(s) [UrunID] are being mapped in both fragments to different conceptual side properties - data inconsistency is possible because the corresponding conceptual side properties can be independently modified. C:\Users\pc\Desktop\GenoTip.Satis\GenoTip.DAL\ModelSatis.edmx 749 15 GenoTip.DAL
        Error 5 Error 3012: Problem in Mapping Fragments starting at lines 748, 879: Data loss is possible in Satis.UrunID. An Entity with Key (PK) will not round-trip when: (PK does NOT play Role 'Satis' in AssociationSet 'FK_Satis_Urun' AND PK is in 'Satis' EntitySet) C:\Users\pc\Desktop\GenoTip.Satis\GenoTip.DAL\ModelSatis.edmx 880 11 GenoTip.DAL
        Error 6 Error 3012: Problem in Mapping Fragments starting at lines 748, 879: Data loss is possible in Satis.UrunID. An Entity with Key (PK) will not round-trip when: (PK is in 'Satis' EntitySet AND PK does NOT play Role 'Satis' in AssociationSet 'FK_Satis_Urun' AND Entity.UrunID is not NULL) C:\Users\pc\Desktop\GenoTip.Satis\GenoTip.DAL\ModelSatis.edmx 749 15 GenoTip.DAL

    Read the article

  • Team Foundation Server Setup/Access

    - by Angel Brighteyes
    What I need: a TFS 2010 setup that allows 2 application developers to access the TFS from remote locations.

    How it is set up:

    - Server 2008 Standard, 2 GB RAM, 300 GB HD space
    - SharePoint Server 2007, using SQL Server 2005
    - SQL Server 2008 Standard
    - Team Foundation Server 2010
    - IIS 7
    - SharePoint bindings: TFS.DynAccount.Me:80; TFS:80
    - TFS bindings: TFS.DynAccount.Me:8080; TFS:8080
    - Using the DynDNS service to account for the dynamic IP address being used; this is a requirement for the moment until I can get a better ISP package.
    - Access using local accounts. The server is not set up on a domain, or as a domain. Consequently I did not set up AD services.

    Problem: When logged into TFS using my credentials TFS\AdminUser through the DynDNS account TFS.DynAccount.Me, I receive the 'Red X of Death' on the Documents and Reports folder. When logged into TFS through the local peer-to-peer network using the same credentials TFS\AdminUser, I do not receive the 'Red X of Death' problem.

    Further troubleshooting: When users right-click 'TeamProject1' and click 'Show Project Portal', it tries to take them to http://TFS:8080 instead of http://TFS.DynAccount.Me:8080, which, from further research, I am assuming is because Team Foundation Server was set up with a local name of TFS instead of 'TFS.DynAccount.Me', as described here in Visual Studio Magazine: The Red X of Death. Users can access the Team Portal for SharePoint via http://TFS.DynAccount.Me/TeamCollection/TeamProject, so it is not like we are dead in the water or anything. However, as most employees/staff are prone to do, they have expressed a great distaste for having to do it this way and just be patient until the current project is finished, since we are under a very strict deadline.

    Is there a way to set this up differently, change some settings someplace, reinstall it, point a CNAME record for our domain website to the DynAccount (e.g. TFS.OurDomain.com points to TFS.DynAccount.Me, which consequently does allow access to the http site without issues), or something? After all the time and effort I have put into this (first the cost, second the bloody install, third learning SharePoint well enough, fourth the hours into days spent on this, fifth more troubleshooting, sixth employee headaches), I really don't feel like just letting it lay where it is. I figure in my spare/off time I will keep trying to get this to work. So I really appreciate any help anyone can give me. I know this is probably something really stupid simple that I will 'Face Palm' over, but at the moment the stress and frustration just have me beat. Thank you again; this community has always been a great help.

    Read the article

  • Asp.net MVC and MOSS 2010 integration

    - by Robert Koritnik
    Just a sidenote: I'm not sure whether I should post this to Server Fault as well, because some MOSS admin may have some info for me too.

    A bit of explanation first (without Asp.net MVC): is it possible to integrate the two? Is it possible to write an application that would share at least credential information with MOSS? I have to write a MOSS application that involves these technologies:

    - MOSS 2010
    - Personal client certificate authentication (most probably on USB keys)
    - Active Directory Federation Services
    - A separate SQL DB that would serve application-specific data (separate as in not being part of the MOSS DB)

    How should it work?

    - Users should authenticate into MOSS 2010 using personal certificates.
    - There would be a certain part of MOSS that would be related to my custom application.
    - This application should only authorize certain users via AD FS; I guess these users should have a certain security claim attached to them.
    - This application should manage users (that have access to this app) with additional (app-specific) security claims related to this application (as additional application-level authorization rights for individual application parts).
    - This application should use a custom SQL 2008 DB heavily, with its own data.
    - This application should have the possibility to integrate with external systems as well (Exchange, for instance, to inject calendar entries, ERP systems, etc.).
    - This application should be able to export its data (from its DB) to files. I don't know if it's possible, but it would be nice if the app could add these files to MOSS and attach authorization info to them so only users with sufficient rights would be able to view/open these files.

    Why Asp.net MVC then? I'm very well versed in Asp.net MVC (also with the latest version) and I haven't done anything on SharePoint since version 2003 (which doesn't do me any good or prepare me for the latest version in any way, shape or form). This project will most probably be a death march project, so I would rather write my application as a UI-rich Asp.net MVC application and somehow integrate it into MOSS. But not only via a link, because I would like to at least share credentials, so users wouldn't need to re-login when accessing my app. Using Asp.net MVC I would at least have the possibility to finish on time or be less death-marching. Is this at all possible?

    Questions:

    1. Is it possible to integrate Asp.net MVC into MOSS as described above?
    2. If integration is not possible, would it be possible to create a completely MOSS-based application that would work as described?
    3. Which parts of MOSS 2010 should I use to accomplish what I need?

    Read the article

  • Find out CRC or CHECKSUM of RS232 data

    - by Carlos Alloatti
    I need to communicate with a RS232 device, I have no specs or information available. I send a 16 byte command and get a 16 byte result back. The last byte looks like some kind of crc or checksum, I have tried using this http://miscel.dk/MiscEl/miscelCRCandChecksum.html with no luck. Anyone can reverse engineer the crc/checksum algorithm? here is some data captured with an RS-232 monitor program: 01 80 42 00 00 00 00 00 00 00 00 00 00 00 01 B3 01 80 42 00 00 00 00 00 00 00 00 00 00 00 02 51 01 80 42 00 00 00 00 00 00 00 00 00 00 00 03 0F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 04 8C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 05 D2 01 80 42 00 00 00 00 00 00 00 00 00 00 00 06 30 01 80 42 00 00 00 00 00 00 00 00 00 00 00 07 6E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 08 2F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 09 71 01 80 42 00 00 00 00 00 00 00 00 00 00 00 0A 93 01 80 42 00 00 00 00 00 00 00 00 00 00 00 0B CD 01 80 42 00 00 00 00 00 00 00 00 00 00 00 0C 4E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 0D 10 01 80 42 00 00 00 00 00 00 00 00 00 00 00 0E F2 01 80 42 00 00 00 00 00 00 00 00 00 00 00 0F AC 01 80 42 00 00 00 00 00 00 00 00 00 00 00 10 70 01 80 42 00 00 00 00 00 00 00 00 00 00 00 11 2E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 12 CC 01 80 42 00 00 00 00 00 00 00 00 00 00 00 13 92 01 80 42 00 00 00 00 00 00 00 00 00 00 00 14 11 01 80 42 00 00 00 00 00 00 00 00 00 00 00 15 4F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 16 AD 01 80 42 00 00 00 00 00 00 00 00 00 00 00 17 F3 01 80 42 00 00 00 00 00 00 00 00 00 00 00 18 B2 01 80 42 00 00 00 00 00 00 00 00 00 00 00 19 EC 01 80 42 00 00 00 00 00 00 00 00 00 00 00 1A 0E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 1B 50 01 80 42 00 00 00 00 00 00 00 00 00 00 00 1C D3 01 80 42 00 00 00 00 00 00 00 00 00 00 00 1D 8D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 1E 6F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 1F 31 01 80 42 00 00 00 00 00 00 00 00 00 00 00 20 CE 01 80 42 00 00 00 00 00 00 00 00 00 00 00 21 90 01 80 42 00 00 00 00 00 00 00 00 00 00 00 22 72 01 80 42 00 00 00 00 00 00 00 00 00 00 00 23 2C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 24 AF 01 80 42 00 00 00 00 00 00 00 00 00 00 00 25 F1 01 80 42 00 00 00 00 00 00 00 00 00 00 00 26 13 01 80 42 00 00 00 00 00 00 00 00 00 00 00 27 4D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 28 0C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 29 52 01 80 42 00 00 00 00 00 00 00 00 00 00 00 2A B0 01 80 42 00 00 00 00 00 00 00 00 00 00 00 2B EE 01 80 42 00 00 00 00 00 00 00 00 00 00 00 2C 6D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 2D 33 01 80 42 00 00 00 00 00 00 00 00 00 00 00 2E D1 01 80 42 00 00 00 00 00 00 00 00 00 00 00 2F 8F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 30 53 01 80 42 00 00 00 00 00 00 00 00 00 00 00 31 0D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 32 EF 01 80 42 00 00 00 00 00 00 00 00 00 00 00 33 B1 01 80 42 00 00 00 00 00 00 00 00 00 00 00 34 32 01 80 42 00 00 00 00 00 00 00 00 00 00 00 35 6C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 36 8E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 37 D0 01 80 42 00 00 00 00 00 00 00 00 00 00 00 38 91 01 80 42 00 00 00 00 00 00 00 00 00 00 00 39 CF 01 80 42 00 00 00 00 00 00 00 00 00 00 00 3A 2D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 3B 73 01 80 42 00 00 00 00 00 00 00 00 00 00 00 3C F0 01 80 42 00 00 00 00 00 00 00 00 00 00 00 3D AE 01 80 42 00 00 00 00 00 00 00 00 00 00 00 3E 4C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 3F 12 01 80 42 00 00 00 00 00 00 00 00 00 00 00 40 AB 01 80 42 00 00 00 00 00 00 00 00 00 00 00 41 F5 01 80 42 00 00 00 00 00 00 00 
00 00 00 00 42 17 01 80 42 00 00 00 00 00 00 00 00 00 00 00 43 49 01 80 42 00 00 00 00 00 00 00 00 00 00 00 44 CA 01 80 42 00 00 00 00 00 00 00 00 00 00 00 45 94 01 80 42 00 00 00 00 00 00 00 00 00 00 00 46 76 01 80 42 00 00 00 00 00 00 00 00 00 00 00 47 28 01 80 42 00 00 00 00 00 00 00 00 00 00 00 48 69 01 80 42 00 00 00 00 00 00 00 00 00 00 00 49 37 01 80 42 00 00 00 00 00 00 00 00 00 00 00 4A D5 01 80 42 00 00 00 00 00 00 00 00 00 00 00 4B 8B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 4C 08 01 80 42 00 00 00 00 00 00 00 00 00 00 00 4D 56 01 80 42 00 00 00 00 00 00 00 00 00 00 00 4E B4 01 80 42 00 00 00 00 00 00 00 00 00 00 00 4F EA 01 80 42 00 00 00 00 00 00 00 00 00 00 00 50 36 01 80 42 00 00 00 00 00 00 00 00 00 00 00 51 68 01 80 42 00 00 00 00 00 00 00 00 00 00 00 52 8A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 53 D4 01 80 42 00 00 00 00 00 00 00 00 00 00 00 54 57 01 80 42 00 00 00 00 00 00 00 00 00 00 00 55 09 01 80 42 00 00 00 00 00 00 00 00 00 00 00 56 EB 01 80 42 00 00 00 00 00 00 00 00 00 00 00 57 B5 01 80 42 00 00 00 00 00 00 00 00 00 00 00 58 F4 01 80 42 00 00 00 00 00 00 00 00 00 00 00 59 AA 01 80 42 00 00 00 00 00 00 00 00 00 00 00 5A 48 01 80 42 00 00 00 00 00 00 00 00 00 00 00 5B 16 01 80 42 00 00 00 00 00 00 00 00 00 00 00 5C 95 01 80 42 00 00 00 00 00 00 00 00 00 00 00 5D CB 01 80 42 00 00 00 00 00 00 00 00 00 00 00 5E 29 01 80 42 00 00 00 00 00 00 00 00 00 00 00 5F 77 01 80 42 00 00 00 00 00 00 00 00 00 00 00 60 88 01 80 42 00 00 00 00 00 00 00 00 00 00 00 61 D6 01 80 42 00 00 00 00 00 00 00 00 00 00 00 62 34 01 80 42 00 00 00 00 00 00 00 00 00 00 00 63 6A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 64 E9 01 80 42 00 00 00 00 00 00 00 00 00 00 00 65 B7 01 80 42 00 00 00 00 00 00 00 00 00 00 00 66 55 01 80 42 00 00 00 00 00 00 00 00 00 00 00 67 0B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 68 4A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 69 14 01 80 42 00 00 00 00 00 00 00 00 00 00 00 6A F6 01 80 42 00 00 00 00 00 00 00 00 00 00 00 6B A8 01 80 42 00 00 00 00 00 00 00 00 00 00 00 6C 2B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 6D 75 01 80 42 00 00 00 00 00 00 00 00 00 00 00 6E 97 01 80 42 00 00 00 00 00 00 00 00 00 00 00 6F C9 01 80 42 00 00 00 00 00 00 00 00 00 00 00 70 15 01 80 42 00 00 00 00 00 00 00 00 00 00 00 71 4B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 72 A9 01 80 42 00 00 00 00 00 00 00 00 00 00 00 73 F7 01 80 42 00 00 00 00 00 00 00 00 00 00 00 74 74 01 80 42 00 00 00 00 00 00 00 00 00 00 00 75 2A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 76 C8 01 80 42 00 00 00 00 00 00 00 00 00 00 00 77 96 01 80 42 00 00 00 00 00 00 00 00 00 00 00 78 D7 01 80 42 00 00 00 00 00 00 00 00 00 00 00 79 89 01 80 42 00 00 00 00 00 00 00 00 00 00 00 7A 6B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 7B 35 01 80 42 00 00 00 00 00 00 00 00 00 00 00 7C B6 01 80 42 00 00 00 00 00 00 00 00 00 00 00 7D E8 01 80 42 00 00 00 00 00 00 00 00 00 00 00 7E 0A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 7F 54 01 80 42 00 00 00 00 00 00 00 00 00 00 00 80 61 01 80 42 00 00 00 00 00 00 00 00 00 00 00 81 3F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 82 DD 01 80 42 00 00 00 00 00 00 00 00 00 00 00 83 83 01 80 42 00 00 00 00 00 00 00 00 00 00 00 84 00 01 80 42 00 00 00 00 00 00 00 00 00 00 00 85 5E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 86 BC 01 80 42 00 00 00 00 00 00 00 00 00 00 00 87 E2 01 80 42 00 00 00 00 00 00 00 00 00 00 00 88 A3 01 80 42 00 00 00 00 00 00 00 00 00 00 00 89 FD 01 80 42 00 00 00 00 00 00 00 00 00 00 00 8A 1F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 8B 41 01 80 42 00 00 00 00 00 00 00 00 
00 00 00 8C C2 01 80 42 00 00 00 00 00 00 00 00 00 00 00 8D 9C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 8E 7E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 8F 20 01 80 42 00 00 00 00 00 00 00 00 00 00 00 90 FC 01 80 42 00 00 00 00 00 00 00 00 00 00 00 91 A2 01 80 42 00 00 00 00 00 00 00 00 00 00 00 92 40 01 80 42 00 00 00 00 00 00 00 00 00 00 00 93 1E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 94 9D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 95 C3 01 80 42 00 00 00 00 00 00 00 00 00 00 00 96 21 01 80 42 00 00 00 00 00 00 00 00 00 00 00 97 7F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 98 3E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 99 60 01 80 42 00 00 00 00 00 00 00 00 00 00 00 9A 82 01 80 42 00 00 00 00 00 00 00 00 00 00 00 9B DC 01 80 42 00 00 00 00 00 00 00 00 00 00 00 9C 5F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 9D 01 01 80 42 00 00 00 00 00 00 00 00 00 00 00 9E E3 01 80 42 00 00 00 00 00 00 00 00 00 00 00 9F BD 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A0 42 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A1 1C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A2 FE 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A3 A0 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A4 23 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A5 7D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A6 9F 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A7 C1 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A8 80 01 80 42 00 00 00 00 00 00 00 00 00 00 00 A9 DE 01 80 42 00 00 00 00 00 00 00 00 00 00 00 AA 3C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 AB 62 01 80 42 00 00 00 00 00 00 00 00 00 00 00 AC E1 01 80 42 00 00 00 00 00 00 00 00 00 00 00 AD BF 01 80 42 00 00 00 00 00 00 00 00 00 00 00 AE 5D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 AF 03 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B0 DF 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B1 81 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B2 63 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B3 3D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B4 BE 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B5 E0 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B6 02 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B7 5C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B8 1D 01 80 42 00 00 00 00 00 00 00 00 00 00 00 B9 43 01 80 42 00 00 00 00 00 00 00 00 00 00 00 BA A1 01 80 42 00 00 00 00 00 00 00 00 00 00 00 BB FF 01 80 42 00 00 00 00 00 00 00 00 00 00 00 BC 7C 01 80 42 00 00 00 00 00 00 00 00 00 00 00 BD 22 01 80 42 00 00 00 00 00 00 00 00 00 00 00 BE C0 01 80 42 00 00 00 00 00 00 00 00 00 00 00 BF 9E 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C0 27 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C1 79 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C2 9B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C3 C5 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C4 46 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C5 18 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C6 FA 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C7 A4 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C8 E5 01 80 42 00 00 00 00 00 00 00 00 00 00 00 C9 BB 01 80 42 00 00 00 00 00 00 00 00 00 00 00 CA 59 01 80 42 00 00 00 00 00 00 00 00 00 00 00 CB 07 01 80 42 00 00 00 00 00 00 00 00 00 00 00 CC 84 01 80 42 00 00 00 00 00 00 00 00 00 00 00 CD DA 01 80 42 00 00 00 00 00 00 00 00 00 00 00 CE 38 01 80 42 00 00 00 00 00 00 00 00 00 00 00 CF 66 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D0 BA 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D1 E4 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D2 06 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D3 58 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D4 DB 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D5 85 01 80 42 00 00 00 00 00 00 00 00 00 
00 00 D6 67 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D7 39 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D8 78 01 80 42 00 00 00 00 00 00 00 00 00 00 00 D9 26 01 80 42 00 00 00 00 00 00 00 00 00 00 00 DA C4 01 80 42 00 00 00 00 00 00 00 00 00 00 00 DB 9A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 DC 19 01 80 42 00 00 00 00 00 00 00 00 00 00 00 DD 47 01 80 42 00 00 00 00 00 00 00 00 00 00 00 DE A5 01 80 42 00 00 00 00 00 00 00 00 00 00 00 DF FB 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E0 04 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E1 5A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E2 B8 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E3 E6 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E4 65 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E5 3B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E6 D9 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E7 87 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E8 C6 01 80 42 00 00 00 00 00 00 00 00 00 00 00 E9 98 01 80 42 00 00 00 00 00 00 00 00 00 00 00 EA 7A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 EB 24 01 80 42 00 00 00 00 00 00 00 00 00 00 00 EC A7 01 80 42 00 00 00 00 00 00 00 00 00 00 00 ED F9 01 80 42 00 00 00 00 00 00 00 00 00 00 00 EE 1B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 EF 45 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F0 99 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F1 C7 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F2 25 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F3 7B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F4 F8 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F5 A6 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F6 44 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F7 1A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F8 5B 01 80 42 00 00 00 00 00 00 00 00 00 00 00 F9 05 01 80 42 00 00 00 00 00 00 00 00 00 00 00 FA E7 01 80 42 00 00 00 00 00 00 00 00 00 00 00 FB B9 01 80 42 00 00 00 00 00 00 00 00 00 00 00 FC 3A 01 80 42 00 00 00 00 00 00 00 00 00 00 00 FD 64 01 80 42 00 00 00 00 00 00 00 00 00 00 00 FE 86 01 80 42 00 00 00 00 00 00 00 00 00 00 00 FF D8 The second to last byte seems to be a sequential number that starts over at 00 when it reaches FF. I have included the whole range from 00 to FF to make it easier to guess the crc/checksum method.
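
    A quick way to narrow the algorithm down is to brute-force a few common 8-bit checks against (covered bytes, check byte) pairs pulled from the dump above. The sketch below (Python) is only a starting point: exactly which bytes the check covers is still an assumption you would have to vary.

        def xor8(data):
            c = 0
            for b in data:
                c ^= b
            return c

        def sum8(data):
            return sum(data) & 0xFF

        def crc8(data, poly=0x07, init=0x00):
            # plain bitwise CRC-8; sweep poly/init if the default pair never matches
            crc = init
            for b in data:
                crc ^= b
                for _ in range(8):
                    crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
            return crc

        # Fill in (covered_bytes, observed_check_byte) pairs extracted from the capture.
        # Which bytes the check covers is unknown, so try several slices per record.
        samples = []

        for name, fn in (("xor8", xor8), ("sum8", sum8), ("crc8", crc8)):
            if samples and all(fn(payload) == check for payload, check in samples):
                print("possible match:", name)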

    Read the article

  • Ubuntu hard disk problem

    - by Henadzy
    Hello! I have got the error with a hard disk on Ubuntu 9.10. It slows down my system, applications have not been responding for a long time. But when I mount and use filesystem which placed on this hard disk at other computer it works properly. disk: SAMSUNG HD161HJ (SATA) syslog: Apr 25 00:28:25 vare6gin kernel: [ 885.773839] ata3.00: exception Emask 0x1 SAct 0x1e SErr 0x0 action 0x6 frozen Apr 25 00:28:25 vare6gin kernel: [ 885.773845] ata3.00: Ata error. fis:0x21 Apr 25 00:28:25 vare6gin kernel: [ 885.773861] ata3.00: cmd 60/08:08:3f:00:ad/00:00:10:00:00/40 tag 1 ncq 4096 in Apr 25 00:28:25 vare6gin kernel: [ 885.773864] res 51/40:24:67:c8:91/40:00:05:00:00/40 Emask 0x9 (media error) Apr 25 00:28:25 vare6gin kernel: [ 885.773871] ata3.00: status: { DRDY ERR } Apr 25 00:28:25 vare6gin kernel: [ 885.773877] ata3.00: error: { UNC } Apr 25 00:28:25 vare6gin kernel: [ 885.773890] ata3.00: cmd 60/18:10:9f:6b:ed/00:00:0e:00:00/40 tag 2 ncq 12288 in Apr 25 00:28:25 vare6gin kernel: [ 885.773893] res 51/40:24:67:c8:91/40:00:05:00:00/40 Emask 0x9 (media error) Apr 25 00:28:25 vare6gin kernel: [ 885.773900] ata3.00: status: { DRDY ERR } Apr 25 00:28:25 vare6gin kernel: [ 885.773904] ata3.00: error: { UNC } Apr 25 00:28:25 vare6gin kernel: [ 885.773918] ata3.00: cmd 60/08:18:3f:5f:ed/00:00:0e:00:00/40 tag 3 ncq 4096 in Apr 25 00:28:25 vare6gin kernel: [ 885.773921] res 51/40:24:67:c8:91/40:00:05:00:00/40 Emask 0x9 (media error) Apr 25 00:28:25 vare6gin kernel: [ 885.773927] ata3.00: status: { DRDY ERR } Apr 25 00:28:25 vare6gin kernel: [ 885.773932] ata3.00: error: { UNC } Apr 25 00:28:25 vare6gin kernel: [ 885.773946] ata3.00: cmd 60/08:20:67:c8:91/00:00:05:00:00/40 tag 4 ncq 4096 in Apr 25 00:28:25 vare6gin kernel: [ 885.773948] res 51/40:24:67:c8:91/40:00:05:00:00/40 Emask 0x9 (media error) Apr 25 00:28:25 vare6gin kernel: [ 885.773955] ata3.00: status: { DRDY ERR } Apr 25 00:28:25 vare6gin kernel: [ 885.773960] ata3.00: error: { UNC } Apr 25 00:28:25 vare6gin kernel: [ 885.773970] ata3: hard resetting link Apr 25 00:28:25 vare6gin kernel: [ 885.773974] ata3: nv: skipping hardreset on occupied port Apr 25 00:28:25 vare6gin kernel: [ 886.240073] ata3: SATA link up 3.0 Gbps (SStatus 123 SControl 300) Apr 25 00:28:25 vare6gin kernel: [ 886.256277] ata3.00: configured for UDMA/133 Apr 25 00:28:25 vare6gin kernel: [ 886.256305] ata3: EH complete Apr 25 00:28:27 vare6gin kernel: [ 888.176088] ata3: EH in SWNCQ mode,QC:qc_active 0xF sactive 0xF Apr 25 00:28:27 vare6gin kernel: [ 888.176099] ata3: SWNCQ:qc_active 0xF defer_bits 0x0 last_issue_tag 0x3 Apr 25 00:28:27 vare6gin kernel: [ 888.176102] dhfis 0xF dmafis 0x1 sdbfis 0x0 Apr 25 00:28:27 vare6gin kernel: [ 888.176109] ata3: ATA_REG 0x51 ERR_REG 0x40 Apr 25 00:28:27 vare6gin kernel: [ 888.176113] ata3: tag : dhfis dmafis sdbfis sacitve Apr 25 00:28:27 vare6gin kernel: [ 888.176120] ata3: tag 0x0: 1 1 0 1 Apr 25 00:28:27 vare6gin kernel: [ 888.176126] ata3: tag 0x1: 1 0 0 1 Apr 25 00:28:27 vare6gin kernel: [ 888.176131] ata3: tag 0x2: 1 0 0 1 Apr 25 00:28:27 vare6gin kernel: [ 888.176136] ata3: tag 0x3: 1 0 0 1
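
    The repeated { UNC } media errors at the same sector usually point at the disk itself rather than at Ubuntu. A quick cross-check, assuming the smartmontools package is installed and the failing drive really is /dev/sdb (adjust the device name), is to read the SMART attributes and look at Reallocated_Sector_Ct and Current_Pending_Sector:

        import subprocess

        # Hypothetical device name; replace with the disk that is actually failing.
        report = subprocess.run(["smartctl", "-a", "/dev/sdb"],
                                capture_output=True, text=True).stdout
        for line in report.splitlines():
            if "Reallocated_Sector_Ct" in line or "Current_Pending_Sector" in line:
                print(line)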

    Read the article

  • How can I take the first 100 characters of HTML content (without stripping the tags)?

    - by Atomiton
    There are lots of questions on how to strip html tags, but not many on functions/methods to close them. Here's the situation. I have a 500 character Message summary ( which includes html tags ), but I only want the first 100 characters. Problem is if I truncate the message, it could be in the middle of an html tag... which messes up stuff. Assuming the html is something like this: <div class="bd">"Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. <br/> <br/>Some Dates: April 30 - May 2, 2010 <br/> <p>Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. <em>Duis aute irure dolor in reprehenderit</em> in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum. <br/> </p> For more information about Lorem Ipsum doemdloe, visit: <br/> <a href="http://www.somesite.com" title="Some Conference">Some text link</a><br/> </div> How would I take the first ~100 characters or so? ( Although, ideally that would be the first approximately 100 characters of "CONTENT" ( in between the html tags ) I'm assuming the best way to do this would be a recursive algorithm that keeps track of the html tags and appends any tags that would be truncated, but that may not be the best approach. My first thoughts are using recursion to count nested tags, and when we reach 100 characters, look for the next "<" and then use recursion to write the closing html tags needed from there. The reason for doing this is to make a short summary of existing articles without requiring the user to go back and provide summaries for all the articles. I want to keep the html formatting, if possible. NOTE: Please ignore that the html isn't totally semantic. This is what I have to deal with from my WYSIWYG. EDIT: I added a potential solution ( that seems to work ) I figure others will run into this problem as well. I'm not sure it's the best... and it's probably not totally robust ( in fact, I know it isn't ), but I'd appreciate any feedback
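
    One way to sketch the "count only the content, then close whatever is still open" idea is with a streaming HTML parser. The example below is Python rather than C# (the poster's own attempt is referenced above), treats a small set of tags as self-closing by assumption, and is deliberately simplistic about nesting:

        from html.parser import HTMLParser

        VOID = {"br", "img", "hr", "input", "meta", "link"}   # assumed self-closing

        class Truncator(HTMLParser):
            def __init__(self, limit):
                super().__init__(convert_charrefs=True)
                self.limit = limit
                self.count = 0
                self.out = []
                self.stack = []
                self.done = False

            def handle_starttag(self, tag, attrs):
                if self.done:
                    return
                rendered = "".join(' {}="{}"'.format(k, v) for k, v in attrs if v is not None)
                self.out.append("<{}{}>".format(tag, rendered))
                if tag not in VOID:
                    self.stack.append(tag)

            def handle_endtag(self, tag):
                if self.done or tag in VOID:
                    return
                if tag in self.stack:
                    self.stack.remove(tag)      # simplistic: assumes reasonably well-nested input
                self.out.append("</{}>".format(tag))

            def handle_data(self, data):
                if self.done:
                    return
                room = self.limit - self.count
                if len(data) > room:
                    self.out.append(data[:room] + "...")
                    self.done = True
                else:
                    self.out.append(data)
                    self.count += len(data)

        def truncate_html(markup, limit=100):
            t = Truncator(limit)
            t.feed(markup)
            # close whatever is still open, innermost first
            return "".join(t.out) + "".join("</{}>".format(tag) for tag in reversed(t.stack))

    Counting only the text passed to handle_data gives the "~100 characters of content, not markup" behaviour asked for; counting everything appended to out would give a hard 100-character cut instead.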

    Read the article

  • HTML encoding and decoding

    - by Zerotoinfinite
    Hi All, I am looking for a HTML editor, and I found many links through google like this http://online-html-editor.org/ Now I have written something on it: Let say the below content <div> <span style="font-weight: bold; font-size: 12pt; "> Heading</span></div> <div><br /> </div> The'la;skdlajlsdjansdkahskdkhaksdhkhaskdhkhaskhdkashdkhaksda <div>asdljalsjdljalsdjljalsdjljalsdjljalsdlajs;fl'ajduyasdahsldjkagsdhasvdjyhlasjdgklastgians,dkasjdlhakhsdl</div> <div>amsdka;sdlyasdalshdlj,asdh,asdjg,absdlasd/.malskdla'slduljds,vaskkd;jas;dl'asldu'alsdaskd;lk'as;d</div> <div>'a</div> <div>sd;jasldj;asdaklsdka'sld'sai'dkabskdm;;lsidaasfhdlasjd;ljaspodi;ajsd;lka'sld</div> <div>'</div> <div>ad'</div> <div>a;fj;ljas;dfjalshdoiauslkfdnkasfnlka's;dkap[sd'alsd;jlaksfdkajsdfh;alsd;</div> <div>asdkasjd;kaskd;as;dk;aksd;ajsdlkjalksjdlasjdkgasfkjashdjashdkasfdkjashkdasdjo[uipuhlkasdjlkajsdljalsjdlkajsdljaljsdljalsjdlkaslkjdlkasdjlasjdlkjaslkdjlasjdlasudqpeohw09iqwpekjqwehlj</div> <div><br /> </div> <div> <div> bool tt = new bool();</div> <div> if (txtStatus.Text == "true")</div> <div> tt = true;</div> <div> else</div> <div> tt = false;</div> <div><br /> </div> <div> </div></div> Now I want to save this content into the database and display as a normal text on a page. While extracting I can use Server.HTMLDecode, but I am facing problem while inserting this html data which I have copied from the sites. Please help. Thanks in advance.
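
    The usual pattern is to store the markup as-is (a parameterized insert keeps quoting problems out of the way) and encode it only at render time, so the page shows the literal HTML source rather than interpreted markup. Sketched here in Python, where html.escape/unescape stand in for Server.HtmlEncode/HtmlDecode:

        import html
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, body TEXT)")

        raw = '<div><span style="font-weight: bold;">Heading</span></div>'

        # Insert: parameterized query, the markup is stored verbatim.
        conn.execute("INSERT INTO pages (body) VALUES (?)", (raw,))

        # Display as normal text: escape only when writing it into the page.
        stored = conn.execute("SELECT body FROM pages").fetchone()[0]
        print(html.escape(stored))     # -> &lt;div&gt;&lt;span style=&quot;...&quot;&gt;Heading...
        # html.unescape() reverses it if an already-encoded form was stored instead.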

    Read the article

  • C++: Implementing Named Pipes using the Win32 API

    - by Mike Pateras
    I'm trying to implement named pipes in C++, but either my reader isn't reading anything, or my writer isn't writing anything (or both). Here's my reader: int main() { HANDLE pipe = CreateFile(GetPipeName(), GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL); char data[1024]; DWORD numRead = 1; while (numRead >= 0) { ReadFile(pipe, data, 1024, &numRead, NULL); if (numRead > 0) cout << data; } return 0; } LPCWSTR GetPipeName() { return L"\\\\.\\pipe\\LogPipe"; } And here's my writer: int main() { HANDLE pipe = CreateFile(GetPipeName(), GENERIC_WRITE, 0, NULL, OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL); string message = "Hi"; WriteFile(pipe, message.c_str(), message.length() + 1, NULL, NULL); return 0; } LPCWSTR GetPipeName() { return L"\\\\.\\pipe\\LogPipe"; } Does that look right? numRead in the reader is always 0, for some reason, and it reads nothing but 1024 -54's (some weird I character). Solution: Reader (Server): while (true) { HANDLE pipe = CreateNamedPipe(GetPipeName(), PIPE_ACCESS_INBOUND | PIPE_ACCESS_OUTBOUND , PIPE_WAIT, 1, 1024, 1024, 120 * 1000, NULL); if (pipe == INVALID_HANDLE_VALUE) { cout << "Error: " << GetLastError(); } char data[1024]; DWORD numRead; ConnectNamedPipe(pipe, NULL); ReadFile(pipe, data, 1024, &numRead, NULL); if (numRead > 0) cout << data << endl; CloseHandle(pipe); } Writer (client): HANDLE pipe = CreateFile(GetPipeName(), GENERIC_READ | GENERIC_WRITE, 0, NULL, OPEN_EXISTING, 0, NULL); if (pipe == INVALID_HANDLE_VALUE) { cout << "Error: " << GetLastError(); } string message = "Hi"; cout << message.length(); DWORD numWritten; WriteFile(pipe, message.c_str(), message.length(), &numWritten, NULL); return 0; The server blocks until it gets a connected client, reads what the client writes, and then sets itself up for a new connection, ad infinitum. Thanks for the help, all!
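
    For comparison, the same create-first, connect-second handshake can be sketched with Python's standard library, which wraps Windows named pipes when family='AF_PIPE' is used (pipe name copied from the question; Windows only):

        from multiprocessing.connection import Listener, Client

        ADDRESS = r"\\.\pipe\LogPipe"

        def server():
            # Like CreateNamedPipe + ConnectNamedPipe: create the pipe, then block for a client.
            with Listener(ADDRESS, family="AF_PIPE") as listener:
                with listener.accept() as conn:
                    print("received:", conn.recv())

        def client():
            # Like CreateFile on the existing pipe: only works once the server end exists.
            with Client(ADDRESS, family="AF_PIPE") as conn:
                conn.send("Hi")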

    Read the article

  • How do I pass an ArrayCollection to an AdvancedDataGrid using HierarchicalData?

    - by R.Vijayakumar
    Problem with passing arraycollection to Advance datagrid. My Arraycollection structure like `   private var groupList:ArrayCollection = new ArrayCollection([ {Country:'India', children:[ {Country:'Series1', children:[                                {Matches:'India Test series 1',isEnable:false,id:1,isSelected:true},                                {Matches:'India Test series 2',isEnable:false,id:2,isSelected:true},                                {Matches:'India Test series 3',isEnable:false,id:3,isSelected:true}]},              {Country:'Series2', children:[                                {Matches:'Australia Test series 1',isEnable:false,id:25,isSelected:true},                                {Matches:'Australia Test series 2',isEnable:false,id:26,isSelected:true},                                {Matches:'Australia Test series 3',isEnable:false,id:27,isSelected:true}]} ]}, {Country:'Austrila', children:[ {Country:'Series1', children:[                                {Matches:'Australia Test series 1',isEnable:false,id:46,isSelected:true},                                {Matches:'Australia Test series 2',isEnable:false,id:47,isSelected:true},                                {Matches:'Australia Test series 3',isEnable:false,id:48,isSelected:true}]}, {Country:'Series2', children:[                                {Matches:'Australia Test series 1',isEnable:false,id:49,isSelected:true},                                {Matches:'Australia Test series 2',isEnable:false,id:50,isSelected:true},                                {Matches:'Australia Test series 3',isEnable:false,id:51,isSelected:true}]}, {Country:'Series3', children:[                                {Matches:'Australia Test series 1',isEnable:false,id:52,isSelected:true},                                {Matches:'Australia Test series 2',isEnable:false,id:53,isSelected:true},                                {Matches:'Australia Test series 3',isEnable:false,id:54,isSelected:true}]} ]} passing AD in dataProvider="{new HierarchicalData(groupList)}" It's working fine. it's show two menu of tree and childrens based on country .But i tried dynamic xml convert to Arraycollection by below code private function convertXmlToArrayCollection( file:String ):ArrayCollection { var xml:XMLDocument = new XMLDocument( file ); //var decoder:SimpleXMLDecoder = new SimpleXMLDecoder(); var decoder1:SimpleXMLDecoder = new SimpleXMLDecoder(true); var data1:Object = decoder1.decodeXML( xml ); var array1:Array = ArrayUtil.toArray(data1); return new ArrayCollection( array1 ); } my xml structure is <Country Country="India "> <Country Country="Series "> <Matches Matches="BIndependiente-Colon" id="701536" isEnable="false" isSelected="true" startDate="2009-10-29 01:30:00" EndDate="2009-10-29 01:30:00"/> <Matches Matches="Boca Juniors-Chacarita Juniors" id="701633" isEnable="false" isSelected="true" startDate="2009-10-29 19:00:00" EndDate=""/> </Country> </Country> <Country Country="Australia"> <Country Country="series"> <Matches Matches="BIndependiente-Colon" id="701536" isEnable="false" isSelected="true" startDate="2009-10-29 01:30:00" EndDate="2009-10-29 01:30:00"/> <Matches Matches="Boca Juniors-Chacarita Juniors" id="701633" isEnable="false" isSelected="true" startDate="2009-10-29 19:00:00" EndDate=""/> </Country> </Country> So if i tried to convert this format of xml code to arryacollection , it converted the array collection but when will i pass to Advance data grid it not show any result . What did i wrong ? 
groupList1 = convertXmlToArrayCollection(string1); Alert.show(groupList1[0].Country[0].Matches[0].id.toString()); // output is 701536. So the converted collection does contain the data, yet the AdvancedDataGrid shows nothing when it is used as the data provider. Where is my mistake, and what do I need to change?
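
    Not ActionScript, but the shape is the point: with object data, HierarchicalData follows a children property by default, so the decoded XML has to end up nested the same way as the handwritten groupList. A sketch of that normalization in Python, using a trimmed copy of the XML from the question:

        import xml.etree.ElementTree as ET

        def to_node(elem):
            node = dict(elem.attrib)                 # Country=..., Matches=..., id=..., ...
            kids = [to_node(child) for child in elem]
            if kids:
                node["children"] = kids              # the property the grid's hierarchy walks
            return node

        xml_text = """<root>
          <Country Country="India">
            <Country Country="Series">
              <Matches Matches="BIndependiente-Colon" id="701536" isEnable="false" isSelected="true"/>
            </Country>
          </Country>
        </root>"""

        group_list = [to_node(country) for country in ET.fromstring(xml_text)]
        print(group_list[0]["children"][0]["children"][0]["id"])   # 701536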

    Read the article

  • Good working habits to observe in project development?

    - by Will Marcouiller
    As my development experience grows, I see fit to stick to best practices from here and there to build somehow my own working practices while observing the conventions, etc. I'm currently working on a project which my goals is to graduate the security access model from an environment's Active Directory to another environment's automatically. I don't know for any of you, but as far as I'm concerned, I meet some real difficulties sticking to only one way, then develop. I mean, I learn something new everyday while visiting SO, and recently wanted to get acquainted with generics. On the other hand, I better know the Façade pattern which proved to be very practical in transactional programming in process systems. This seems to be less practical for desktop application as there are plenty of variables to consider in a desktop application that you don't have to care in transactional programming, as you're playing only with information data. As for my current project, I have: Groups; Organizational Units; Users. Which are all considered an entry in the Active Directory. This points out to be a good candidate for generics, as also approached this way by Bart de Smett's Linq to AD on CodePlex. He has a DirectorySource<T>, and to manage let's say groups, then he instantiate a source with the proper type: var groups = new DirectorySource<Group>(); This seems to be very a good way of doing. Despite, I seem to go from one pattern to another and I don't seem to be able to strictly stick to one. While I'm aware that one must not stay with only one way of doing, since each pattern statisfies certain advantages, while also illustrating disadvantages under some usage conditions, I seem to want to develop with both patterns having a singleton Façade class with the underlying factories which represent the sub systems: GroupsFactory; UsersFactory; OrganizationalUnitsFactory. Each of the factories offers the possible operations for their respective entity (group, user, OU). To make a very long story short, I often have plenty of ideas while developping and this causes me some trouble, as I go from an idea to another feeling completely lost after a while. Yet I understand the advantages and disavantages, I have no trouble choosing from one pattern to another depending on the situation. Nevertheless, when it comes to programming itself, if I'm not part of a team, I feel sometimes like I can't do anything good. That is, because I can't stand not doing something "perfect" the first time. The role I play within the project is both: the project manager and the programmer. I am more comfortable in the project manager role, architectural role, analytical role than the developer's. Has any of you some good habbits to observe in project development? Thanks to you all! =)
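
    The two ideas being weighed here, a generic per-entity source and a façade in front of the sub systems, are not mutually exclusive. A language-neutral sketch of combining them (Python, names purely illustrative):

        from typing import Generic, List, Type, TypeVar

        class Group: ...
        class User: ...
        class OrganizationalUnit: ...

        T = TypeVar("T")

        class DirectorySource(Generic[T]):
            """One generic source per entry type, in the spirit of DirectorySource<T>."""
            def __init__(self, entry_type: Type[T]):
                self.entry_type = entry_type
                self._items: List[T] = []
            def add(self, item: T) -> None:
                self._items.append(item)
            def all(self) -> List[T]:
                return list(self._items)

        class DirectoryFacade:
            """Single entry point hiding the per-entity sources (the sub systems)."""
            def __init__(self):
                self.groups = DirectorySource(Group)
                self.users = DirectorySource(User)
                self.ous = DirectorySource(OrganizationalUnit)

        facade = DirectoryFacade()
        facade.users.add(User())
        print(len(facade.users.all()))   # 1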

    Read the article

  • Web Services Primer for a WinForms Developer?

    - by Unicorns
    I've been writing client/server applications with Winforms for about six years now, but I have yet to venture into the web space (neither ASP.NET nor web services). Given the direction that the job market has been heading for some time and the fact that I have a basic curiosity, I'd like to get involved with writing web services, but I don't know where to start. I've read about various options (XML/SOAP vs. JSON, REST vs...well, actually I don't know what it's called, etc.), but I'm not sure what sort of criteria are in play when making the determination to use one or the other. Obviously, I'd like to leverage the tools that I have (Visual Studio, the .NET framework, etc.) without hamstringing myself into only targeting a particular audience (i.e. writing the service in such a way as to make it difficult to consume from a Windows Mobile/Android/iPhone client, for example). For the record, my plan--for now--is to use WCF for my web service development, but I'm open to using another .NET approach if that's advisable. I realize that this question is pretty open-ended so it may get closed, but here are some things I'm wondering: What are some things to consider when choosing the type of web service (REST, etc.) I intend to write? Is it possible (and, if so, feasible) to move from one approach to another? Can web services be written in an event-driven way? As I said I'm a Winforms developer, so I'm used to objects raising events for me to react to. For instance, if I have two clients connected to my service, is there a way for me to "push" information to one of them as a result of an action by the other? If this is possible, is this advisable or am I just not thinking about it correctly? What authentication mechanisms seem to work best for public-facing services? What about if I plan to have different types of OS'es and clients connecting to the service? Is there a generally accepted platform-agnostic approach? In the line of authentication, is this something that I should be doing myself (authenticating an managing sessions, etc.) or is this something should be handled at the framework level and I just define exactly how it should work? If that's the case, how do I tell who the requester has authenticated themselves as? I started writing an authentication mechanism (simple username/password combinations stored in the database and a corresponding session table with a GUID key) within my service and just requiring that key to be passed with every operation (other than logging in, of course), but I want to make sure that I'm not reinventing the wheel here. However, I also don't want to clutter up the server with a bunch of machine user accounts just to use Basic authentication. I'm also under the impression that Digest (and of course Windows) authentication requires a machine (or AD) user account.
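
    The session-key scheme described in the last paragraph is a common platform-agnostic pattern rather than wheel-reinvention. Stripped of any framework it looks roughly like this (Python sketch; a real service would also expire tokens, compare credentials in constant time, and run over HTTPS):

        import secrets

        _users = {"jsmith": "s3cret"}     # stand-in for the username/password table
        _sessions = {}                    # token -> username (the "session table")

        def login(username, password):
            if _users.get(username) != password:
                raise PermissionError("bad credentials")
            token = secrets.token_hex(16)     # plays the role of the GUID session key
            _sessions[token] = username
            return token

        def whoami(token):
            """Every other operation resolves the caller from the token passed with the request."""
            try:
                return _sessions[token]
            except KeyError:
                raise PermissionError("not authenticated") from None

        t = login("jsmith", "s3cret")
        print(whoami(t))   # jsmith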

    Read the article

  • Merge Multiple Excel Workbooks

    - by IRHM
    I wonder whether someone may be able to help me please. I'm trying to use the code below to allow the user to select multiple Excel Workbooks, amalgamating the data into one 'Summary' sheet. Sub Merge() Dim DestWB As Workbook, WB As Workbook, WS As Worksheet, SourceSheet As String Set DestWB = ActiveWorkbook SourceSheet = "Input" startrow = 7 FileNames = Application.GetOpenFilename( _ filefilter:="Excel Files (*.xls*),*.xls*", _ Title:="Select the workbooks to merge.", MultiSelect:=True) If IsArray(FileNames) = False Then If FileNames = False Then Exit Sub End If End If For n = LBound(FileNames) To UBound(FileNames) Set WB = Workbooks.Open(Filename:=FileNames(n), ReadOnly:=True) For Each WS In WB.Worksheets If WS.Name = SourceSheet Then With WS If .UsedRange.Cells.Count > 1 Then dr = DestWB.Worksheets("Input").Range("C" & Rows.Count).End(xlUp).Row + 1 lastrow = .Range("C" & Rows.Count).End(xlUp).Row For j = lastrow To startrow Step -1 Select Case .Range("E" & j).Value Case "Manager", "Lead", "Technical", "Analyst" 'do nothing Case Else .Rows(j).EntireRow.Delete End Select Next lastrow = .Range("C" & Rows.Count).End(xlUp).Row If lastrow >= startrow Then .Range("B" & startrow & ":AD" & lastrow).Copy DestWB.Worksheets("Input").Cells(dr, "B").PasteSpecial xlValues .Range("AF" & startrow & ":AQ" & lastrow).Copy DestWB.Worksheets("Input").Cells(dr, "AF").PasteSpecial xlValues .Range("AS" & startrow & ":AS" & lastrow).Copy DestWB.Worksheets("Input").Cells(dr, "AS").PasteSpecial xlValues End If End If End With Exit For End If Next WS WB.Close savechanges:=False Next n End Sub The code works fine except for one issue which I've been trying to solve for the last few weeks. The following line of code looks in column E of the Source file, and if any of the entries match the values shown in the code it copies that row of data to paste into the Destination file. If Range("E" & j) <> "Manager" And Range("E" & j) <> "Lead" And Range("E" & j) <> "Technical" And Range("E" & j) <> "Analyst" Then Rows(j).Delete The problem I have is that if none of these values are found in the Source file, I receive the following error: Run time error '1004': Delete method of range class failed and in Debug mode it highlights this part of the line as the source of the error, but I've no idea why. Rows(j).Delete I just wondered whether someone may be able to look at this please and let me know where I'm going wrong, or perhaps even suggest a more efficient process of allowing the user to merge the workbooks. Many thanks and kind regards
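
    One note on the older line quoted at the end: the unqualified Range("E" & j) and Rows(j) refer to the active sheet rather than to WS, which is a common source of 1004 errors in loops like that one. As a completely different route, the same merge-and-filter can be sketched with pandas (the column-to-header mapping below is an assumption about the sheet layout):

        import glob
        import pandas as pd

        KEEP = {"Manager", "Lead", "Technical", "Analyst"}
        frames = []

        for path in glob.glob("workbooks/*.xls*"):                     # stand-in for GetOpenFilename
            df = pd.read_excel(path, sheet_name="Input", skiprows=6)   # assumes headers sit on row 7
            df = df[df["Role"].isin(KEEP)]                             # "Role" assumed to be column E's header
            frames.append(df)

        summary = pd.concat(frames, ignore_index=True)
        summary.to_excel("summary.xlsx", index=False)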

    Read the article

  • Modify values on-the-fly during SqlDataAdapter.Fill()

    - by Timothy
    What would the proper way be to modify values on the fly as they are loaded into a DataTable by SqlAdapter.Fill()? I have globalized my application's log messages. An integer indicating the event type and serialized data relevant to the event is stored in the database as show below. When I display the logged events through a DataGridView control to the user, I interpolate the data to a formatting string. event_type event_timestamp event_details ============================================ 3 2010-05-04 20:49:58 jsmith 1 2010-05-04 20:50:42 jsmith ... I am currently iterating through the DataTable's rows to format the messages. public class LogDataTable : DataTable { public LogDataTable() { Locale = CultureInfo.CurrentCulture; Columns.AddRange(new DataColumn[] { new DataColumn("event_type", typeof(Int32)), new DataColumn("event_timestamp", typeof(DateTime)), new DataColumn("event_details", typeof(String))}); } } ... using (SqlDataAdapter adapter = new SqlDataAdapter(...)) { adapter.SelectCommand.Parameters.AddRange(new Object[] { ... }); adapter.Fill(table); } foreach (DataRow row in table.Rows) { switch ((LogEventType)row["event_type"]) { case LogEventType.Create: row["event_details"] = String.Format(Resources.Strings.LogEventCreateMsg, row["event_details"]; break; case LogEventType.Create: row["event_details"] = String.Format(Resources.Strings.LogEventCreateMsg, row["event_details"]; break; ... The end result as displayed would resemble: Type Date and Time Details ==================================================================== [icon] 2010-05-04 20:49:58 Failed login attempt with username jsmith [icon] 2010-05-04 20:50:42 Successful login with username jsmith ... It seems wasteful to iterate the result set twice-- once as the table is filled by the adapter, and again to perform the replacements. I would really like to do the replacement on-the-fly in my LogDataTable class as it is being populated. I have tried overriding an OnRowChanging method in LogDataTable, which throws an InRowChangingEventException. protected override void OnRowChanging(DataRowChangeEventArgs e) { base.OnRowChanging(e); switch ((LogEventType)row["event_type"]) ... I have tried overriding an OnRowChanged method, which throws a StackOverflowException (I assume changing it re-triggers the method ad infinitum?). I have tried overriding an OnTableNewRow method, which does not throw an exception but appears not to be invoked (I assume only when a user adds a row in the view, which I've prevented). I'd greatly appreciate any assistance anyone can give me.
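
    Fill itself does not expose a clean per-row hook (as the RowChanging/RowChanged experiments show), so the two usual escapes are to format at display time (for a DataGridView, the CellFormatting event) or to build the display text in the same pass that reads the data. The second idea, sketched with Python's sqlite3 purely to show the single-pass shape (message templates are illustrative):

        import sqlite3

        TEMPLATES = {
            1: "Successful login with username {0}",
            3: "Failed login attempt with username {0}",
        }

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE log (event_type INT, event_timestamp TEXT, event_details TEXT)")
        conn.executemany("INSERT INTO log VALUES (?, ?, ?)",
                         [(3, "2010-05-04 20:49:58", "jsmith"),
                          (1, "2010-05-04 20:50:42", "jsmith")])

        display_rows = [
            (etype, ts, TEMPLATES.get(etype, "{0}").format(details))   # format as each row streams out
            for etype, ts, details in conn.execute(
                "SELECT event_type, event_timestamp, event_details FROM log")
        ]
        for row in display_rows:
            print(row)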

    Read the article

  • Image animation problem in Silverlight

    - by Jak
    Hi followed " http://www.switchonthecode.com/tutorials/silverlight-3-tutorial-planeprojection-and-perspective-3d#comment-4688 ".. the animation is working fine. I am new to silver light. when i use dynamic image from xml instead of static image as in tutorial,.. it is not working fine, please help me on this. i used list box.. for this animation effect do i need to change listbox to some other arrangement ? if your answer yes means, pls give me some sample code. Thanks in advance. Xaml code: <ListBox Name="listBox1"> <ListBox.ItemTemplate> <DataTemplate> <StackPanel> <Image Source="{Binding imgurl}" HorizontalAlignment="Left" Name="image1" Stretch="Fill" VerticalAlignment="Top" MouseLeftButtonUp="FlipImage" /> </StackPanel> </DataTemplate> </ListBox.ItemTemplate> </ListBox> My C# code: //getting image URL from xml XElement xmlads = XElement.Parse(e.Result); //i bind the url in to listBox listBox1.ItemsSource = from ads in xmlads.Descendants("ad") select new zestItem { imgurl = ads.Element("picture").Value }; public class zestItem { public string imgurl { get; set; } } private int _zIndex = 10; private void FlipImage(object sender, MouseButtonEventArgs e) { Image image = sender as Image; // Make sure the image is on top of all other images. image.SetValue(Canvas.ZIndexProperty, _zIndex++); // Create the storyboard. Storyboard flip = new Storyboard(); // Create animation and set the duration to 1 second. DoubleAnimation animation = new DoubleAnimation() { Duration = new TimeSpan(0, 0, 1) }; // Add the animation to the storyboard. flip.Children.Add(animation); // Create a projection for the image if it doesn't have one. if (image.Projection == null) { // Set the center of rotation to -0.01, which will put a little space // between the images when they're flipped. image.Projection = new PlaneProjection() { CenterOfRotationX = -0.01 }; } PlaneProjection projection = image.Projection as PlaneProjection; // Set the from and to properties based on the current flip direction of // the image. if (projection.RotationY == 0) { animation.To = 180; } else { animation.From = 180; animation.To = 0; } // Tell the animation to animation the image's PlaneProjection object. Storyboard.SetTarget(animation, projection); // Tell the animation to animation the RotationYProperty. Storyboard.SetTargetProperty(animation, new PropertyPath(PlaneProjection.RotationYProperty)); flip.Begin(); }

    Read the article

  • Symfony2 Forms: is it possible to bind a form in an "unconventional way"?

    - by DonCallisto
    Imagine this scenario: in our company there is an employee that "play" around graphic,css,html and so on. Our new project will born under symfony2 so we're trying some silly - but "real" - stuff (like authentication from db, submit data from a form and persist it to db and so on..) The problem As far i know, learnt from symfony2 "book" that i found on the site (you can find it here), there is an "automated" way for creating and rendering forms: 1) Build the form up into a controller in this way $form = $this->createFormBuilder($task) ->add('task','text'), ->add('dueDate','date'), ->getForm(); return $this->render('pathToBundle:Controller:templateTwig', array('form'=>$form->createview()); 2) Into templateTwig render the template {{ form_widget(form) }} // or single rows method 3) Into a controller (the same that have a route where you can submit data), take back submitted information if($rquest->getMethod()=='POST'){ $form->bindRequest($request); /* and so on */ } Return to scenario Our graphic employee don't want to access controllers, write php and other stuff like those. So he'll write a twig template with a "unconventional" (from symfony2 point of view, but conventional from HTML point of view) method: /* into twig template */ <form action="{{ path('SestanteUserBundle_homepage') }}" method="post" name="userForm"> <div> USERNAME: <input type="text" name="user_name" value="{{ user.username}}"/> </div> <div> EMAIL: <input type="text" name="user_mail" value="{{ user.email }}"/> </div> <input type="hidden" name="user_id" value="{{ id }}" /> <input type="submit" value="modifica i dati"> </form> Now, if into the controller that handle the submission of data we do something like that public function indexAction(Request $request) { if($request->getMethod() == 'POST'){ // sono arrivato per via di un submit, quindi devo modificare i dati prima di farli vedere a video $defaultData = array('message'=>'ho visto questa cosa in esempio, ma non capisco se posso farne a meno'); $form = $this->createFormBuilder($defaultData) ->add('user_name','text') ->add('user_mail','email') ->add('user_id','integer') ->getForm(); $form->bindRequest($request); //bindo la form ad una request $data = $form->getData(); //mi aspetto un'array chiave=>valore /* .... */ We expected that $data will contain an array with key,value from the submitted form. We found that it isn't true. After googling for a while and try with other "bad" ideas, we're frozen into that. So, if you have a "graphic office" that can't handle directly php code, how can we interface from form(s) to controller(s) ? UPDATE It seems that Symfony2 use a different convention for form's field name and lookup once you've submitted that. In particular, if my form's name is addUser and a field is named userName, the field's name will be AddUser[username] so maybe it have a "dynamic" lookup method that will extract form's name, field's name, concat them and lookup for values. Is it possible?
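
    The convention mentioned in the update is the crux: the form component prefixes every field with the form's name, so the POST body carries AddUser[username] rather than username, and a hand-written template either has to use those bracketed names or the controller has to read the raw POST fields itself (e.g. $request->request->get('user_name')). A small illustration of how bracketed names group back together, in Python only to show the mechanics:

        from urllib.parse import parse_qsl

        body = "AddUser%5Busername%5D=jsmith&AddUser%5Bemail%5D=j%40example.org&user_id=42"

        grouped = {}
        for key, value in parse_qsl(body):
            if "[" in key and key.endswith("]"):
                form, field = key[:-1].split("[", 1)
                grouped.setdefault(form, {})[field] = value
            else:
                grouped[key] = value

        print(grouped)   # {'AddUser': {'username': 'jsmith', 'email': 'j@example.org'}, 'user_id': '42'}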

    Read the article

  • Intent filter for browsing XML (specifically RSS) in Android

    - by Leif Andersen
    I have an activity that I want to run every time the user goes to an xml (specifically rss) page in the browser (at least assuming the user get's it from the list of apps that can support it). I currently already have the current intent filter: <activity android:name=".activities.EpisodesListActivity" android:theme="@android:style/Theme.NoTitleBar"> <intent-filter> <category android:name="android.intent.category.BROWSABLE"></category> <category android:name="android.intent.category.DEFAULT"></category> <action android:name="android.intent.action.VIEW"></action> <data android:scheme="http"></data> </intent-filter> </activity> Now as you can guess, this is an evil intent, as it wants to open whenever a page is requested via http. However, when I ad the line: <data android:mimeType="application/rss+xml"></data> to make it: <activity android:name=".activities.EpisodesListActivity" android:theme="@android:style/Theme.NoTitleBar"> <intent-filter> <category android:name="android.intent.category.BROWSABLE"></category> <category android:name="android.intent.category.DEFAULT"></category> <action android:name="android.intent.action.VIEW"></action> <data android:scheme="http"></data> <data android:mimeType="application/rss+xml"></data> </intent-filter> </activity> The application no longer claims to be able to run rss files. Also, if I change the line to: <data android:mimeType="application/xml"></data> It also won't work (for generic xml file even). So what intent filter do I need to make in order to claim that the activity supports rss. (Also, bonus points if you can tell me how I know what URL it was the user opened. So far, I've always sent that information from one activity to the other using extras). Thank you for your help
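
    Worth checking before fighting the filter: the hand-off is matched against the MIME type the web server reports for the page, and plenty of feeds are served as text/xml or application/xml rather than application/rss+xml. A quick way to see what a given feed actually advertises (placeholder URL):

        from urllib.request import Request, urlopen

        req = Request("https://example.org/feed.rss", method="HEAD")
        with urlopen(req) as resp:
            print(resp.headers.get("Content-Type"))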

    Read the article

  • How to make an item view render rich (HTML) text in PyQt?

    - by Giorgio Gelardi
    I'm trying to translate code from this thread in python: import sys from PyQt4.QtCore import * from PyQt4.QtGui import * __data__ = [ "Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.", "Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.", "Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur.", "Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum." ] def get_html_box(text): return '''<table border="0" width="100%"><tr width="100%" valign="top"> <td width="1%"><img src="softwarecenter.png"/></td> <td><table border="0" width="100%" height="100%"> <tr><td><b><a href="http://www.google.com">titolo</a></b></td></tr> <tr><td>{0}</td></tr><tr><td align="right">88/88/8888, 88:88</td></tr> </table></td></tr></table>'''.format(text) class HTMLDelegate(QStyledItemDelegate): def paint(self, painter, option, index): model = index.model() record = model.listdata[index.row()] doc = QTextDocument(self) doc.setHtml(get_html_box(record)) doc.setTextWidth(option.rect.width()) painter.save() ctx = QAbstractTextDocumentLayout.PaintContext() ctx.clip = QRectF(0, option.rect.top(), option.rect.width(), option.rect.height()) dl = doc.documentLayout() dl.draw(painter, ctx) painter.restore() def sizeHint(self, option, index): model = index.model() record = model.listdata[index.row()] doc = QTextDocument(self) doc.setHtml(get_html_box(record)) doc.setTextWidth(option.rect.width()) return QSize(doc.idealWidth(), doc.size().height()) class MyListModel(QAbstractListModel): def __init__(self, parent=None, *args): super(MyListModel, self).__init__(parent, *args) self.listdata = __data__ def rowCount(self, parent=QModelIndex()): return len(self.listdata) def data(self, index, role=Qt.DisplayRole): return index.isValid() and QVariant(self.listdata[index.row()]) or QVariant() class MyWindow(QWidget): def __init__(self, *args): super(MyWindow, self).__init__(*args) # listview self.lv = QListView() self.lv.setModel(MyListModel(self)) self.lv.setItemDelegate(HTMLDelegate(self)) self.lv.setResizeMode(QListView.Adjust) # layout layout = QVBoxLayout() layout.addWidget(self.lv) self.setLayout(layout) if __name__ == "__main__": app = QApplication(sys.argv) w = MyWindow() w.show() sys.exit(app.exec_()) Element's size and position are not calculated correctly I guess, perhaps because I haven't understand at all the style related parts from original code. Can someone help me?
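
    One likely culprit: the document is laid out in the item's own coordinate space, but paint() never moves the painter to the item's rectangle, so every row draws from the top of the viewport. A sketch of paint() with the usual translate-and-clip fix, relying on the same imports as the listing above (untested against this exact code):

        def paint(self, painter, option, index):
            record = index.model().listdata[index.row()]
            doc = QTextDocument()
            doc.setHtml(get_html_box(record))
            doc.setTextWidth(option.rect.width())

            painter.save()
            painter.translate(option.rect.topLeft())      # draw in the item's own coordinates
            ctx = QAbstractTextDocumentLayout.PaintContext()
            ctx.clip = QRectF(0, 0, option.rect.width(), option.rect.height())
            doc.documentLayout().draw(painter, ctx)
            painter.restore()

    If the heights still look wrong, note that option.rect.width() is often zero inside sizeHint(), so basing the size on doc.idealWidth() or a fixed wrap width there tends to be more reliable.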

    Read the article

< Previous Page | 105 106 107 108 109 110 111 112 113 114 115 116  | Next Page >