Search Results

Search found 75233 results on 3010 pages for 'data distribution service'.


  • Very Cool – Miami 311 System for tracking citizen service requests (Windows Azure, Silverlight, Bing Maps)

    - by Jim Duffy
    Having grown up in South Florida, this short but very enlightening video explaining how the City of Miami has implemented a 311 citizen service request system using Windows Azure, Silverlight and Bing Maps definitely caught my attention. The Miami311 system is a Windows Azure/Silverlight-based solution which enables City of Miami citizens to report and track issues reported to city management. The system uses Bing Maps to plot the location and relevant information about each issue reported. Citizens now have the ability to easily see the status of an issue without having to call the city office. What I found interesting were a couple of benefits that a metropolitan area such as Miami can take advantage of in a Windows Azure cloud-based solution. For the City of Miami, both benefits center around the weather. Of course the threat of a hurricane is a real issue in South Florida, and what better way to make sure your site stays up during a hurricane than to have the site hosted far away from the eye of the storm. Using a Windows Azure cloud-based architecture, the City of Miami is able to host the application within the Microsoft data centers, safely away from any hurricane passing through South Florida. The second benefit is the inherent scalability of a Windows Azure based solution. During a severe weather event like thunderstorms or, even worse, a hurricane, downed trees and power lines are a commonly reported problem. Being able to quickly scale up the computing resources required to handle the spike in citizens reporting these types of problems on the site is a huge benefit. Once the weather event has passed and downed tree reports begin to subside, they can quickly reverse the process and scale the system back down to pre-storm levels. It's day-to-day kind of stuff, but very cool nonetheless. Have a day. :-|

    Read the article

  • Why do my Application Compatibility Toolkit Data Collectors fail to write to my ACT Log Share?

    - by Jay Michaud
    I am trying to get the Microsoft Application Compatibility Toolkit 5.6 (version 5.6.7320.0) to work, but I cannot get the Data Collectors to write to the ACT Log Share. The configuration is as follows.

    Machine: ACT-Server
    Domain: mydomain.example.com
    OS: Windows 7 Enterprise 64-bit Edition
    Windows Firewall configuration: File and Printer Sharing (SMB-In) is enabled for Public, Domain, and Private networks
    ACT Log Share: ACT

    Share permissions*:
    - Everyone: Full Control
    - Administrator: Full Control
    - Domain Admins: Full Control
    - Administrators: Full Control
    - ANONYMOUS LOGON: Full Control

    Folder permissions*:
    - ANONYMOUS LOGON: Read, write & execute (this folder, subfolders, and files)
    - Domain Admins: Full control (this folder, subfolders, and files)
    - Everyone: Read, write & execute (this folder, subfolders, and files)
    - Administrators: Full control (this folder, subfolders, and files)
    - CREATOR OWNER: Full control (subfolders and files)
    - SYSTEM: Full control (this folder, subfolders, and files)
    - INTERACTIVE: Traverse folder / execute file, List folder / read data, Read attributes, Read extended attributes, Create files / write data, Create folders / append data, Write attributes, Write extended attributes, Delete subfolders and files, Delete, Read permissions (this folder, subfolders, and files)
    - SERVICE: same as INTERACTIVE
    - BATCH: same as INTERACTIVE

    *I am fully aware that these permissions are excessive, but that is beside the point of this question.

    Some of the clients running the Data Collector are domain members, but some are not. I am working under the assumption that this is a Windows file sharing permission issue or a network access policy issue, but of course, I could be wrong. It is my understanding that the Data Collector runs in the security context of the SYSTEM account, which for domain members appears on the network as MYDOMAIN\machineaccount. It is also my understanding from reading numerous pieces of documentation that setting the ANONYMOUS LOGON permissions as I have above should allow these computer accounts and non-domain-joined computers to access the share.

    To test connectivity, I set up the Windows XP Mode virtual machine (VM) on ACT-Server. In the VM, I opened a command prompt running as SYSTEM (using the old "at" command trick) and used it to run explorer.exe. In that Windows Explorer instance, I typed \\ACT-Server\ACT into the address bar and was prompted for logon credentials. The goal, though, was not to be prompted. I also used the "net use /delete" command in the command prompt window to delete connections to the \\ACT-Server\IPC$ share each time my connection attempt failed. I have made sure that the appropriate firewall exceptions are in place. Since ACT-Server is a domain member, the "Network access: Sharing and security model for local accounts" security policy is set to "Classic - local users authenticate as themselves". In spite of this, I still tried enabling the Guest account and adding permissions for it on the share, to no effect.

    What am I missing here? How do I allow anonymous logons to a shared folder as a step toward getting my ACT Data Collectors to deposit their data correctly? Am I even on the right track, or is the issue elsewhere?

    Read the article

  • A Generic RIDC Test Program

    - by Kevin Smith
    Many times I have found it useful to use a Java program that communicates with WebCenter Content (WCC) using RIDC for testing. I might not have access to the web GUI, or I may need to test a service running as a specific user. In the past I had created a number of "one off" programs that submitted specific services, e.g. GET_SEARCH_RESULTS, DOCINFO, etc. Recently I decided to create a generic RIDC test program that could submit any service with the desired parameters based on a configuration file. The program gets the following information from the configuration file:

    - WCC connection information (host, port)
    - User to use to run the service
    - Service to run
    - Any parameters for the service

    The program will make a connection to the WCC server, send the service request, and print the results of the service call using the getResponseAsString() method. Here is a sample configuration file:

        ridc.host=localhost
        ridc.port=4444
        ridc.user=sysadmin
        ridc.idcservice=GET_SEARCH_RESULTS
        idcservice.QueryText=dDocType <matches> `Document`
        idcservice.SortField=dDocName
        idcservice.SortDesc=ASC

    There is a readme file included in the zip with instructions for how to configure and run the program. The program takes one command line argument, the configuration file name. The configuration file name is optional and defaults to config.properties. If you have any suggestions for improvements, let me know. Right now it only submits a single service call each time you run it. One enhancement I have already thought about would be to allow you to specify multiple services to run in the configuration file. You can do that with the current program by having multiple configuration files and running the program multiple times, each with a different configuration file. You can download the program here.

    Read the article

  • BI Applications overview

    - by sv744
    Welcome to the Oracle BI applications blog! This blog will cover various features, the general roadmap, descriptions of functionality and implementation steps related to Oracle BI applications. In this first post we start with an overview of the BI apps and will delve deeper into some of the topics below in the upcoming weeks and months. If there are other topics you would like us to talk about, please feel free to provide feedback.

    The Oracle BI applications are a set of pre-built applications that enable pervasive BI by providing role-based insight for each functional area, including sales, service, marketing, contact center, finance, supplier/supply chain, HR/workforce, and executive management. For example, Sales Analytics includes role-based applications for sales executives, sales management, as well as front-line sales reps, each of whom have different needs. The applications integrate and transform data from a range of enterprise sources—including Siebel, Oracle, PeopleSoft, SAP, and others—into actionable intelligence for each business function and user role. This blog starts with the key benefits and characteristics of Oracle BI applications. In a series of subsequent blogs, each of these points will be explained in detail.

    Why BI apps?
    - Demonstrate the value of BI to a business user: show reports / dashboards / a model that can answer their business questions as part of the sales cycle.
    - Demonstrate technical feasibility of a BI project and significantly lower risk and improve success.
    - Build vs. buy benefit: don't have to start with a blank sheet of paper.
    - Help consolidate disparate systems; data integration in M&A situations.
    - Insulate BI consumers from changes in the OLTP.
    - Present OLTP data and highlight issues of poor data / missing data – and improve data quality and accuracy.

    Prebuilt Integrations
    - BI apps support prebuilt integrations against leading ERP sources: Fusion Applications, E-Business Suite, PeopleSoft, JD Edwards, Siebel, SAP.
    - Co-developed with inputs from functional experts in the BI and Applications teams.
    - Out-of-the-box dimensional model to source model mappings.
    - Multi-source and multi-instance support.

    Rich Data Model
    BI apps have a very rich dimensional data model built over 10 years that incorporates best practices from a BI modeling perspective as well as reflecting the source system complexities.
    - Conformed dimensional model across all business subject areas allows cross-functional reporting, e.g. customer / supplier 360.
    - Over 360 fact tables across 7 product areas: CRM – 145, SCM – 47, Financials – 28, Procurement – 20, HCM – 27, Projects – 18, Campus Solutions – 21, PLM – 56.
    - Supported by 300 physical dimensions.
    - Support for extensive calendars: Gregorian, enterprise and ledger based.
    - Conformed data model and metrics for real-time vs. warehouse-based reporting.
    - Multi-tenant enabled.

    Extensive BI-related transformations
    BI apps ETL and data integration support various transformations required for dimensional models and reporting requirements. All of these have been distilled into common patterns and abstracted logic which can be readily reused across different modules.
    - Slowly Changing Dimension support.
    - Hierarchy flattening support; row / column hybrid hierarchy flattening; As Is vs. As Was hierarchy support.
    - Currency conversion: support for 3 corporate, CRM, ledger and transaction currencies.
    - UOM conversion.
    - Internationalization / localization; dynamic data translations.
    - Code standardization (Domains).
    - Historical snapshots.
    - Cycle and process lifecycle computations.
    - Balance facts.
    - Equalization of GL accounting chartfields/segments; standardized values for categorizing GL accounts.
    - Reconciliation between GL and subledgers to track accounted/transferred/posted transactions to GL.
    - Materialization of data only available through costly and complex APIs, e.g. Fusion Payroll, EBS / Fusion Accruals.
    - Complex event interpretation of source data, e.g.: what constitutes a transfer; deriving supervisors via the position hierarchy; deriving primary assignment in PSFT; categorizing and transposing Payroll Balances into specific metrics to support side-by-side comparison of measures such as Fixed Salary, Variable Salary, Tax, Bonus, Overtime Payments; counting of events, e.g. converting events to fact counters so that the number of hires can easily be added up and compared alongside the total transfers and terminations.
    - Multi-pass processing of multiple sources, e.g. headcount, salary, promotion, performance, to allow side-by-side comparison.
    - Adding value to data to aid analysis through banding, additional domain classifications and groupings to allow higher-level analytical reporting and data discovery.
    - Calculation of complex measures, for example: COGS, DSO, DPO, inventory turns, etc.; transfers within a hierarchy or out of / into a hierarchy relative to a viewpoint in the hierarchy.

    Configurability and Extensibility support
    BI apps offer support for extensibility for various entities, either as automated extensibility or as part of the extension methodology.
    - Key Flex field and Descriptive Flex support.
    - Extensible attribute support (JDE).
    - Conformed Domains.

    ETL Architecture
    BI apps offer a modular adapter architecture which allows support of multiple product lines in a single conformed model.
    - Multi-source, multi-technology.
    - Orchestration – creates a load plan taking into account task dependencies and the customer's deployment to generate a plan from the customer's many complex ETL tasks.
    - Plan optimization allowing parallel ETL tasks.
    - Oracle: bitmap indexes and partition management.
    - High availability support; follow-the-sun support.

    TCO
    BI apps support several utilities / capabilities that help with overall total cost of ownership and ensure a rapid implementation.
    - Improved cost of ownership – lower cost to deploy.
    - Ongoing support for new versions of the source application.
    - Task-based setup flows.
    - Data lineage.
    - Functional setup performed in a web UI by a functional person.
    - Configuration; test-to-production support.

    Security
    BI apps support both data and object security, enabling implementations to quickly configure the application as per the reporting security needs.
    - Fine-grained object security at report / dashboard and presentation catalog level.
    - Data security integration with source systems.
    - Extensible to support external data security rules.

    Extensive set of KPIs
    - Over 7000 base and derived metrics across all modules.
    - Time series calculations (YoY, % growth, etc.).
    - Common currency and UOM reporting.
    - Cross-subject-area KPIs (analyzing HR vs. GL data, drill from GL to AP/AR, etc.).

    Prebuilt reports and dashboards
    - 3000+ prebuilt reports supporting a large number of industries.
    - Hundreds of role-based dashboards.
    - Dynamic currency conversion at dashboard level.

    Highly tuned performance
    The BI apps have been tuned over the years for both a very performant ETL and dashboard performance. The applications use best practices and advanced database features to enable the best possible performance.
    - Optimized data model for BI and analytic queries.
    - Prebuilt aggregates, and the ability for customers to easily create their own aggregates on warehouse facts, allow for scalable end-user performance.
    - Incremental extracts and loads; incremental aggregate build.
    - Automatic table index and statistics management.
    - Parallel ETL loads.
    - Source system deletes handling.
    - Low-latency extract with GoldenGate; micro-ETL support.
    - Bitmap indexes; partitioning support.
    - Modularized deployment: start small and add other subject areas seamlessly.

    Source-specific staging and real-time schema
    - Support for a source-specific operational reporting schema for EBS, PSFT, Siebel and JDE.

    Application integrations
    The BI apps also allow for integration with source systems as well as other applications that provide value-add through BI and enable BI consumption during operational decision making.
    - Embedded dashboards for Fusion, EBS and Siebel applications.
    - Action Link support.
    - Marketing Segmentation; Sales Predictor Dashboard; Territory Management.

    External integrations
    The BI apps data integration choices include support for loading external data.
    - External data enrichment choices: UNSPSC, Item class, etc.
    - Extensible Spend Classification.

    Broad deployment choices
    - Exalytics support.
    - Databases: Oracle, Exadata, Teradata, DB2, MSSQL.
    - ETL tool of choice: ODI (coming), Informatica.

    Extensible and customizable
    - Extensible architecture and methodology to add custom and external content.
    - Upgradable across releases.

    Thanks for reading a long post, and be on the lookout for future posts. We look forward to your valuable feedback on these topics, as well as suggestions on what other topics you would like us to cover.

    Read the article

  • Software to monitor bill payment to mission critical IT service providers (ISP, DNS etc.)

    - by Sholom
    Hi All, the problem: our very likable but absent-minded bookkeeper keeps neglecting to pay our IT vendors on time. Just this past week our internet service was disconnected. The same could happen to many other mission critical accounts (domain registrar, backup MX, anti-virus license, HackerSafe (McAfee Secure) service and even an 800 number, to name a few). As the sysadmin, I monitor my servers to make sure they are plugged into the power outlet. I believe I should also monitor my services to make sure they are plugged into their money outlet. To compound the problem, when the power goes out someone else will likely notice and notify me. But if a bill is not paid, no one will ever notice until service is lost. Lost as in losing our domain name, which would cause a lot more damage than the power failing on our server.

    [Solution] = [Doesn't work because]:
    - Retrain the bookkeeper = wishful thinking.
    - Notify my manager = already have (via email). Protects me, does not solve the problem.
    - Fire the bookkeeper = what makes you so sure the next one will never forget?

    Bottom line: humans are humans, and sooner or later something critical will be royally messed up. We need to partner with a machine to help us out here. Anybody have the same problem? What software/solution do you use? I would like software that emails me when a bill is past due, just like I get an email when the power outlet fails. Anyone hear of anything like that? Thanks

    Read the article

  • What could be the best way to generalize data from Facebook and Twitter?

    - by Sjaak van der Heide
    I am not sure if this is the best subsite to ask this question, but I'm pretty sure it doesn't fit on the normal or Facebook SO page... I've been asked to make a general API for connecting to several social media platforms (at the moment Facebook and Twitter). I have already implemented both of them separately, meaning I retrieve the data I need from both Facebook and Twitter and hold the data in its own data class, in my case a list of FacebookTimelineItems and a list of TwitterTimelineItems. Now the hard part is taking the parts that are used in both (username, id, message and such) and making one general class that is eventually passed on to who/whatever sent the call to my API. These are two pics of the data classes I have: http://imageshack.us/photo/my-images/703/facebookdata.png/ http://imageshack.us/photo/my-images/204/twitterdata.png/ Probably not 100% correct, but it gives an idea of what it looks like. Now I've been having several ideas about how to generalize the two, which is harder than I thought at first:
    1. Create an interface (TimelineItem) and let the other classes extend that one. This way I'll always be sure I have a class that contains at least the basic info I need. The downside is that deserializing the JSON seems to be a nightmare.
    2. Use the two data classes I have and combine them into a new class afterwards, then pass that one back to whoever requested it. This would probably work, but I get the idea it's not the best way to tackle this problem, and it feels pretty dodgy even if I get it working.
    3. Or, in case the other two are nearly impossible: keep the two separated in the front end, and go sit in the corner crying because I've just figured out you can't lump together Facebook and Twitter...
    Note: I don't have to make the front end part (view), I just make sure the model is nicely filled with data :) I hope I placed this in the right section; if I didn't, I apologise and would like to know where I should go with my question. Thanks in advance for any replies/ideas/opinions on this.
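    For what it's worth, here is a small C# sketch of the first idea (the post does not say which language is in use, and every member name below is invented for illustration): a shared timeline type that both platform-specific classes derive from, so callers of the API only ever see the common fields.

        using System;

        // Shared shape returned by the API; member names are hypothetical.
        public abstract class TimelineItem
        {
            public string Id { get; set; }
            public string Username { get; set; }
            public string Message { get; set; }
            public DateTime CreatedAt { get; set; }
        }

        public class FacebookTimelineItem : TimelineItem
        {
            // Facebook-only fields stay on the subclass.
            public int LikeCount { get; set; }
        }

        public class TwitterTimelineItem : TimelineItem
        {
            // Twitter-only fields stay on the subclass.
            public int RetweetCount { get; set; }
        }

    A common compromise between ideas 1 and 2 is to keep deserializing the JSON into the platform-specific classes and then map them onto the shared type before returning them, which sidesteps the deserialization nightmare mentioned above.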

    Read the article

  • When does Information become Data? (i.e. Information wants to be free) [closed]

    - by James P. Wright
    I often hear programmers talk about how Information Wants To Be Free, which I mostly agree with, but the thing people don't often pay attention to is that information and data are not the same thing. Should data also be free? Does that mean all of you should have full access to my Social Security Number and other personal "information"? Where is the limit? And if there is a limit, why do people throw this phrase around as if it fits every circumstance (like this one)?

    Read the article

  • How to save and retrieve data as key-value pairs or files in isolated storage?

    - by kaleidoscope
    One can use isolated storage to store data locally on the user's computer. There are two ways to use isolated storage. The first way is to save or retrieve data as key/value pairs by using the IsolatedStorageSettings class. The second way is to save or retrieve files by using the IsolatedStorageFile class. More details can be found at http://silverlight.net/learn/quickstarts/isolatedstorage/   Rituraj, J
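    For reference, a minimal C# sketch of both approaches as they would look in a Silverlight project (the key name, file name and stored values are placeholders; the types come from System.IO.IsolatedStorage):

        using System.IO;
        using System.IO.IsolatedStorage;

        public static class IsolatedStorageExample
        {
            public static void SaveAndLoad()
            {
                // 1) Key/value pairs via IsolatedStorageSettings.
                IsolatedStorageSettings settings = IsolatedStorageSettings.ApplicationSettings;
                settings["lastUser"] = "Rituraj";   // add or overwrite a value
                settings.Save();                    // persist the settings dictionary

                string lastUser;
                if (settings.TryGetValue("lastUser", out lastUser))
                {
                    // the value was found and is ready to use
                }

                // 2) Whole files via IsolatedStorageFile.
                using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
                {
                    using (IsolatedStorageFileStream stream = store.OpenFile("notes.txt", FileMode.Create))
                    using (StreamWriter writer = new StreamWriter(stream))
                    {
                        writer.Write("text kept locally on the user's machine");
                    }

                    using (IsolatedStorageFileStream stream = store.OpenFile("notes.txt", FileMode.Open))
                    using (StreamReader reader = new StreamReader(stream))
                    {
                        string contents = reader.ReadToEnd();
                    }
                }
            }
        }

    The settings class suits small bits of state such as user preferences, while the file class suits anything larger or naturally file-shaped.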

    Read the article

  • Google or the global data warehouse: knowledge sharing or ownership of the global market?

    Google or the global data warehouse: knowledge sharing or ownership of the global market? Google has announced that it has put additional data, named World Bank, online in its Public Data Explorer tool. Through this tool you can find worldwide information on agriculture, electricity consumption per capita, CO2 emissions per capita, the number of internet users, and more. Various charts give you statistics on all the capitals, for which you can compare the different information specific to ...

    Read the article

  • Recovering data from failed Raid configuration with 4 drives and two raid sets (Asus P6T / Intel ICH10r)

    - by user56365
    I've added the complete detailed version of my question below for those who can help, but I want to quickly summarize it first. I set up two RAID arrays using (4) WD Raptors: a striped set for the OS and a 1+0 set for crucial data. After one boot (out of the 50 or so times I'd booted) where a cable had fallen out, the drive wasn't recognized in the array anymore. After trying to fix it, another drive did the same. I now have two drives remaining, luckily with the parity information. I know the striped set is gone, but I need the data on the other set. Can anyone recommend anything to recover the data, or a way to fix whatever prevents the RAID controller from recognizing the two drives, even though the utility screen lists them as still part of the configuration but not found?

    More details: I recently upgraded to an ASUS P6T motherboard with an Intel ICH10R RAID controller and changed my previous 4-drive RAID array from strictly a RAID 1+0 set to a RAID 0 set for the OS/page/scratch drive and a RAID 1+0 set for crucial data. I never had problems with this configuration after upgrading, even when a drive died and was replaced; I managed to rebuild the array fine. Unfortunately, this time around a cable came unattached and I booted the system up to the RAID status screen with the degraded error. This shouldn't have been a problem, but after I reattached the drive it was no longer recognized as a member of the array. Both drives actually show up as non-member disks. I've spent a very, very long time online trying to find information or support and haven't had much luck. After spending time trying to scan the drive for errors, damaged partition info, etc., another drive in the set decided it didn't want to be recognized as part of the array. At this point I have two out of the four drives still functioning, but the RAID 1+0 array went from degraded to failed, and I must find a way to retrieve that data. I think the two drives still in the array have the parity information because they show up as OS (110GB), BACKUP (80GB) and OS:1 (110GB), BACKUP (80GB) under Windows Disk Management. The other two are simply 74GB raw unallocated space. Is it possible to recover the data using those two drives only, and which tool would I use? Could it be a simple partition table error or some other error that is repairable with the hard drive utilities out there? I know the RAID 0 set is done for, but I would assume that because the correct drives failed in a 1+0 config to save the data, I can retrieve it somehow.

    Read the article

  • How to install software that "Requires installation of untrusted packages"?

    - by user135682
    I am using Ubuntu 12.10. When I try to update the software, it shows me this error: Requires installation of untrusted packages This requires installing packages from unauthenticated sources. Details kalzium-data kanagram kate-data kde-l10n-engb kde-l10n-zhcn kde-runtime-data kdegames-data klettres-data libakonadi-kabc4 liblxc0 libwildmidi-config libwildmidi1 lxc marble-data nepomuk-core-data parley-data tomboy

    Read the article

  • Case studies for successful service (project) based software development businesses without constant overtime from its employees [closed]

    - by Ryan Taylor
    I work for an IT company that is primarily services (project) based rather than product based. All software engineers are salaried. The company has set a new expectation that everyone should work 48 hours per week instead of 40. Note, this isn't occasional overtime due to crunches; this is the new 40. The reasoning is that this enables the company to provide benefits to its employees, such as monetary incentives and training, because the company is more profitable: more hours worked = more billable hours = larger profit. I understand the need for profitability and the occasional crunch time, and I have put in the extra hours when it was needed and beneficial to the project. However, I am also very sensitive to work-life balance and have raised my concerns about the new expectation. My employer is open to other methods of increasing profitability, so I hold out hope that we can turn things around before it becomes a horrible place to work. How does a services-based company become more profitable without increasing the number of hours expected from its salaried employees? Are there any case studies showing the pros and cons of consistent overtime? Are there any case studies of a successful services-based business model (for software development companies) that does not require consistent overtime from its employees?

    Read the article

  • Windows Server 2003 - Are ODBC Data Sources set per-user?

    - by Jakobud
    When I'm logged into our Windows Server 2003 server, I don't see any ODBC Data Sources, but when a different user logs in (who doesn't have Administrative rights), they have a big list of ODBC Data Sources. Are ODBC Data Sources set on a per-user basis? How come the Administrator can't see that user's ODBC Data Sources?

    Read the article

  • Facebook releases Presto, its open source query engine for big data, said to be ten times faster than Hadoop's

    Facebook releases Presto, its open source query engine for big data, said to be ten times faster than Hadoop's. Many companies, like Facebook, depend on big data. In this field, the Hadoop/Hive pair is among the reference solutions. As a reminder, Hive is the popular query engine for Hadoop. However, it may be that MapReduce, the essential component Hive relies on, is not optimized for situations where the amount of data exceeds a certain...

    Read the article

  • Australian Government Locator Service (AGLS) Metadata - Is it widely adopted?

    - by Brandrally
    Recently I have seen AGLS metadata tags on a couple of sites around Australia:

    <meta name="AGLS.Audience" scheme="agls-audience" content="All"/>
    <meta name="DC.Publisher" scheme="AglsAgent" content="Hyundai"/>

    I had never seen this kind of mark-up before and discovered http://www.agls.gov.au/. Just wondering whether there is a big community and much support out there for adopting these tags? Any thoughts would be great.

    Read the article

  • The Bouches-du-Rhône sponsor developers to promote open data: have you submitted your application?

    The Bouches-du-Rhône are launching a sponsorship operation for application and website developers in order to promote open data and the region, with Developpez.com as a partner. Since April, the Bouches-du-Rhône Tourisme association has joined the open data movement by opening its portal and releasing around a hundred datasets related to tourism in the department. All the information held by the association has been made available and usable by developers, scientists, associations, students and companies. In short, by everyone. There you will find, for example...

    Read the article

  • The Bouches-du-Rhône launch a developer sponsorship operation to promote open data and the region

    The Bouches-du-Rhône are launching a sponsorship operation for application and website developers in order to promote open data and the region, with Developpez.com as a partner. Since April, the Bouches-du-Rhône Tourisme association has joined the open data movement by opening its portal and releasing around a hundred datasets related to tourism in the department. All the information held by the association has been made available and usable by developers, scientists, associations, students and companies. In short, by everyone. There you will find, for...

    Read the article

  • Should I be relying on WebTests for data validation?

    - by Alexander Kahoun
    I have a suite of web tests created for a web service. I use it for testing a particular input method that updates a SQL database. The web service doesn't have a way to retrieve the data; that's not its purpose, only to update it. I have a validator that validates the response XML that the web service generates for each request. All that works fine. It was suggested by a teammate that I add data validation, so that after the initial response validator runs I check the database and compare the data there with what was in the input request. We have a number of services and libraries, separate from the web service I'm testing, that I can use to get the data and compare it. The problem is that when I run the web test, the data validation always fails even when the request succeeds. I've tried putting the thread to sleep between the response validation and the data validation, but to no avail; it always gets the data from before the response validation. I can set a breakpoint and visually see that the data has been updated in the DB; the funny thing is that when I step through it in debug with the breakpoint, it does validate successfully. Before I get too much further into this issue I have to ask: is this the purpose of web tests? Should I be able to validate data through service calls in this manner, or am I asking too much of a web test, and is the response validation as far as I should go?
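    The symptom described (passes under the debugger, fails at full speed, and a fixed sleep does not help) usually points to a race between the service committing its write and the validator reading it. One workaround is to poll for the expected row with a deadline instead of sleeping once; below is a hedged C# sketch of that idea using plain ADO.NET, with the connection string, table and column names invented for illustration:

        using System;
        using System.Data.SqlClient;
        using System.Threading;

        public static class DataValidationHelper
        {
            // Returns true once the expected row becomes visible, or false after the timeout.
            public static bool RowAppeared(string expectedId, TimeSpan timeout)
            {
                const string connectionString = "Server=dbserver;Database=AppDb;Integrated Security=true;";
                const string query = "SELECT COUNT(*) FROM Submissions WHERE SubmissionId = @id";

                DateTime deadline = DateTime.UtcNow + timeout;
                while (DateTime.UtcNow < deadline)
                {
                    using (SqlConnection connection = new SqlConnection(connectionString))
                    using (SqlCommand command = new SqlCommand(query, connection))
                    {
                        command.Parameters.AddWithValue("@id", expectedId);
                        connection.Open();
                        int count = (int)command.ExecuteScalar();
                        if (count > 0)
                        {
                            return true;   // data is visible; run the detailed comparison now
                        }
                    }
                    Thread.Sleep(250);     // brief pause before retrying
                }
                return false;              // let the validation rule report a failure
            }
        }

    This sketch only addresses the timing aspect; whether data validation belongs in a web test at all is the question being asked, and teams draw that line differently.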

    Read the article

  • Error while deploying a web application in an OSGi container using Pax Web

    - by RaulDM
    Hello, I am trying to deploy a web application in a Felix container. I have all the required configuration done for my web app, such as setting up the manifest headers:

    Webapp-Context:
    Bundle-ClassPath:
    Bundle-Activator:
    Import-Package:
    Bundle-SymbolicName:
    etc.

    The Pax bundles that I have dropped into the same container are:

    pax-web-service-0.6.0.jar
    pax-web-jsp-0.7.1.jar
    pax-web-extender-war-0.7.1.jar
    pax-logging-service-1.5.0.jar
    pax-logging-api-1.5.0.jar

    Though the Pax website says that pax-web-service is included in pax-war-extender, it seems that without the pax-web-service bundle all the other bundles become handicapped. I removed the other Pax bundles, pax-web-extender-whiteboard-0.7.1.jar and pax-web-jetty-0.7.1.jar, as I have not seen any usefulness in them. The pax-web-jetty-0.7.1.jar bundle does not even start up; it has dependencies that it cannot resolve from any of the bundles provided by Pax. My browser is displaying:

    HTTP ERROR 403
    Problem accessing /adminmodule/. Reason: FORBIDDEN
    Powered by Jetty://

    while the console log says:

    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - REQUEST /adminmodule/ on org.mortbay.jetty.HttpConnection@1e94001
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.model.ServerModel - Matching [/adminmodule/]...
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.model.ServerModel - Path [/adminmodule/] matched to {pattern=/adminmodule/.*,model=ResourceModel{id=org.ops4j.pax.web.service.internal.model.ResourceModel-2,name=,urlPatterns=[/],alias=/,servlet=ResourceServlet{context=/adminmodule,alias=/,name=},initParams={},context=ContextModel{id=org.ops4j.pax.web.service.internal.model.ContextModel-1,name=adminmodule,httpContext=org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext@11710be,contextParams={webapp.context=adminmodule}}}}
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.HttpServiceContext - Handling request for [/adminmodule/] using http context [org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext@11710be]
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - sessionManager=org.mortbay.jetty.servlet.HashSessionManager@19c6163
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - session=null
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - servlet=
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - chain=org.ops4j.pax.web.service.internal.model.FilterModel-3-
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - servlet holder=
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - call filter org.ops4j.pax.web.service.internal.model.FilterModel-3
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.WelcomeFilesFilter - Apply welcome files filter...
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.WelcomeFilesFilter - Servlet path: /
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.service.internal.WelcomeFilesFilter - Path info: null
    [5884890@qtp-16567002-0 - /adminmodule/] INFO org.ops4j.pax.web.service.internal.HttpServiceContext - getting resource: [/adminmodule.jsp]
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Searching bundle [com.cisco.zaloni.gwt.admin [1]] for resource [/adminmodule.jsp], normalized to [adminmodule.jsp]
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Resource not found
    [5884890@qtp-16567002-0 - /adminmodule/] INFO org.ops4j.pax.web.service.internal.HttpServiceContext - found resource: null
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - call servlet
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Searching bundle [com.cisco.zaloni.gwt.admin [1]] for resource [/], normalized to [/]
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.ops4j.pax.web.extender.war.internal.WebAppWebContainerContext - Resource found as url [bundle://1.0:1/]
    [5884890@qtp-16567002-0 - /adminmodule/] DEBUG org.mortbay.jetty - RESPONSE /adminmodule/ 403

    It is really frustrating; please help, as I am new to OSGi. Raul

    Read the article

  • PHP DOMDocument class unable to access DOM node

    - by turbod
    Hi. I can't parse this URL: http://foldmunka.net

    $ch = curl_init("http://foldmunka.net");
    //curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    //curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); //not necessary unless the file redirects (like the PHP example we're using here)
    $data = curl_exec($ch);
    $info = curl_getinfo($ch);
    curl_close($ch);
    clearstatcache();
    if ($data === false) {
        echo 'cURL failed';
        exit;
    }
    $dom = new DOMDocument();
    $data = mb_convert_encoding($data, 'HTML-ENTITIES', "utf-8");
    $data = preg_replace('/<\!\-\-\[if(.*)\]>/', '', $data);
    $data = str_replace('<![endif]-->', '', $data);
    $data = str_replace('<!--', '', $data);
    $data = str_replace('-->', '', $data);
    $data = preg_replace('@<script[^>]*?>.*?</script>@si', '', $data);
    $data = preg_replace('@<style[^>]*?>.*?</style>@si', '', $data);
    $data = mb_convert_encoding($data, 'HTML-ENTITIES', "utf-8");
    @$dom->loadHTML($data);
    $els = $dom->getElementsByTagName('*');
    foreach($els as $el){
        print $el->nodeName." | ".$el->getAttribute('content')."<hr />";
        if($el->getAttribute('title')) $el->nodeValue = $el->getAttribute('title')." ".$el->nodeValue;
        if($el->getAttribute('alt')) $el->nodeValue = $el->getAttribute('alt')." ".$el->nodeValue;
        print $el->nodeName." | ".$el->nodeValue."<hr />";
    }

    I need the alt and title attributes and the plain text, but on this page I cannot access the nodes within the body tag.

    Read the article

  • How can I 'transpose' my data using SQL and remove duplicates at the same time?

    - by Remnant
    I have the following data structure in my database:

    LastName FirstName CourseName
    John     Day       Pricing
    John     Day       Marketing
    John     Day       Finance
    Lisa     Smith     Marketing
    Lisa     Smith     Finance
    etc...

    The data shows employees within a business and which courses they have shown a preference to attend. The number of courses per employee will vary (i.e. as above, John has 3 courses and Lisa 2). I need to take this data from the database and pass it to a webpage view (ASP.NET MVC). I would like the data that comes out of my database to match the view as much as possible, and I want to transform the data using SQL so that it looks like the following:

    LastName FirstName Course1   Course2   Course3
    John     Day       Pricing   Marketing Finance
    Lisa     Smith     Marketing Finance

    Any thoughts on how this may be achieved? Note: one of the reasons I am trying this approach is that the original data structure does not easily lend itself to being iterated over using the typical MVC syntax: <% foreach (var item in Model.courseData) { %> Because of the duplication of names in the original data, I would end up with lots of conditionals in my View, which I would like to avoid. I have tried transforming the data using C# in my ViewModel but have found it tough going, and I feel I could lighten the workload by leveraging SQL before I return the data. Thanks.
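    For comparison, a C# sketch of the ViewModel-side grouping mentioned above, i.e. the in-memory alternative to pivoting in SQL (type and property names are invented; the view would then loop over employees and, inside that, over each employee's Courses list, which avoids the conditionals):

        using System.Collections.Generic;
        using System.Linq;

        // Hypothetical shape of a row as it comes back from the database.
        public class CoursePreferenceRow
        {
            public string LastName { get; set; }
            public string FirstName { get; set; }
            public string CourseName { get; set; }
        }

        // One entry per employee, with that employee's courses collected together.
        public class EmployeeCoursesViewModel
        {
            public string LastName { get; set; }
            public string FirstName { get; set; }
            public List<string> Courses { get; set; }
        }

        public static class CoursePivot
        {
            public static List<EmployeeCoursesViewModel> Pivot(IEnumerable<CoursePreferenceRow> rows)
            {
                return rows
                    .GroupBy(r => new { r.LastName, r.FirstName })
                    .Select(g => new EmployeeCoursesViewModel
                    {
                        LastName = g.Key.LastName,
                        FirstName = g.Key.FirstName,
                        Courses = g.Select(r => r.CourseName).ToList()
                    })
                    .ToList();
            }
        }

    Doing the same pivot purely in SQL typically means numbering each employee's courses and turning those row numbers into columns, but the exact syntax depends on the database engine, which the question does not state.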

    Read the article

  • C#/.Net Error: MySql.Data Object reference not set to an instance of an object.

    - by Simon
    Hello, I get this exception on my Windows 7 64-bit machine in an application running in VS 2008 Express. I am using Connector/Net 6.2.2.0:

    Message: Object reference not set to an instance of an object.
    Source: MySql.Data
    in MySql.Data.MySqlClient.NativeDriver.GetResult(Int32& affectedRow, Int32& insertedId)
    Stack trace:
    in MySql.Data.MySqlClient.Driver.GetResult(Int32 statementId, Int32& affectedRows, Int32& insertedId)
    in MySql.Data.MySqlClient.Driver.NextResult(Int32 statementId)
    in MySql.Data.MySqlClient.MySqlDataReader.NextResult()
    in MySql.Data.MySqlClient.MySqlDataReader.Close()
    in MySql.Data.MySqlClient.MySqlConnection.Close()
    in MySql.Data.MySqlClient.MySqlConnection.Dispose(Boolean disposing)
    in System.ComponentModel.Component.Finalize()
    No inner exception.

    The exception is unhandled and the debugger doesn't point to any code line; it just says "Object reference not set to an instance of an object. MySql.Data". This error is really hard to reproduce. On my Windows XP 32-bit machine everything is OK. Could it be an error specific to 64-bit Windows 7? Thank you very much for your answers. Regards, Simon
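    Not a confirmed fix, but one detail stands out: the stack trace ends in System.ComponentModel.Component.Finalize(), meaning the connection was being closed by the garbage collector's finalizer rather than by the code. A hedged C# sketch of deterministic disposal, which keeps the reader and connection off the finalizer thread (the connection string and query are placeholders):

        using MySql.Data.MySqlClient;

        public static class MySqlReadExample
        {
            public static void ReadRows()
            {
                const string connectionString = "Server=localhost;Database=test;Uid=appuser;Pwd=secret;";

                using (MySqlConnection connection = new MySqlConnection(connectionString))
                using (MySqlCommand command = new MySqlCommand("SELECT id, name FROM items", connection))
                {
                    connection.Open();
                    using (MySqlDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // consume the row while the reader is still open
                            int id = reader.GetInt32(0);
                            string name = reader.GetString(1);
                        }
                    } // reader closed here, before the connection is disposed
                }     // connection closed here, not in a finalizer
            }
        }

    If the application already disposes everything explicitly, the 64-bit timing difference may simply be exposing a driver bug, and trying a newer Connector/Net release would be the next step.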

    Read the article

  • Am I correctly extracting JPEG binary data from this mysqldump?

    - by Glenn
    I have a very old .sql backup of a vBulletin site that I ran around 8 years ago. I am trying to see the file attachments that are stored in the DB. The script below extracts them all, and the output is verified to be JPEG by hex dumping and checking the SOI (start of image) and EOI (end of image) bytes (FFD8 and FFD9, respectively) according to the JPEG wiki page. But when I try to open them with evince, I get this message: "Error interpreting JPEG image file (JPEG datastream contains no image)". What could be going on here?

    Some background info:
    - the SQL dump is around 8 years old
    - vBulletin 2.x was the software that stored the attachments
    - most likely PHP 4
    - most likely MySQL 4.0, possibly even 3.x
    - the column datatype these attachments are stored in is mediumtext

    My Python 3.1 script:

    #!/usr/bin/env python3.1
    import re

    trim_l = re.compile(b"""^INSERT INTO attachment VALUES\('\d+', '\d+', '\d+', '(.+)""")
    trim_r = re.compile(b"""(.+)', '\d+', '\d+'\);$""")
    extractor = re.compile(b"""^(.*(?:\.jpe?g|\.gif|\.bmp))', '(.+)$""")

    with open('attachments.sql', 'rb') as fh:
        for line in fh:
            data = trim_l.findall(line)[0]
            data = trim_r.findall(data)[0]
            data = extractor.findall(data)
            if data:
                name, data = data[0]
                try:
                    filename = 'files/%s' % str(name, 'UTF-8')
                    ah = open(filename, 'wb')
                    ah.write(data)
                except UnicodeDecodeError:
                    continue
                finally:
                    ah.close()
    fh.close()

    Update: The JPEG wiki page says FF bytes are section markers, with the next byte indicating the section type. I see some that are not listed in the wiki page (specifically, I see a lot of 5C bytes, so FF5C). But the list is of "common markers", so I'm trying to find a more complete list. Any guidance here would also be appreciated.

    Read the article

  • WCF Service error received when using TCP: "The message could not be dispatched..."

    - by StM
    I am new to creating WCF services. I have created a WCF web service in VS2008 that is running on IIS 7. When I use HTTP, the service works perfectly. When I configure the service for TCP and run, I get the following error message: "There was a communication problem. The message could not be dispatched because the service at the endpoint address 'net:tcp://elec:9090/CoordinateIdTool_Tcp/IdToolService.svc' is unavailable for the protocol of the address." I have searched a lot of forums, including this one, for a resolution, but nothing has worked. Everything appears to be set up correctly on IIS 7. WAS has been set up to run. The default web site has a net.tcp binding, and the application has net.tcp under the enabled protocols. I am including what I think is the important part of the web.config from the host project and also the app.config from the client project I am using to test the service. Hopefully someone can spot my error. Thanks in advance for any help or recommendations that anyone can provide.

    Web.Config

    <bindings>
      <wsHttpBinding>
        <binding name="wsHttpBindingNoMsgs">
          <security mode="None" />
        </binding>
      </wsHttpBinding>
    </bindings>
    <services>
      <service behaviorConfiguration="CogIDServiceHost.ServiceBehavior" name="CogIDServiceLibrary.CogIdService">
        <endpoint address="" binding="wsHttpBinding" bindingConfiguration="wsHttpBindingNoMsgs" contract="CogIDServiceLibrary.CogIdTool">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
        <endpoint address="mex" binding="mexHttpBinding" bindingConfiguration="" contract="IMetadataExchange" />
        <endpoint name="CoordinateIdService_TCP" address="net.tcp://elec:9090/CoordinateIdTool_Tcp/IdToolService.svc" binding="netTcpBinding" bindingConfiguration="" contract="CogIDServiceLibrary.CogIdTool">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
      </service>
    </services>
    <behaviors>
      <serviceBehaviors>
        <behavior name="CogIDServiceHost.ServiceBehavior">
          <serviceMetadata httpGetEnabled="true" />
          <serviceDebug includeExceptionDetailInFaults="false" />
        </behavior>
      </serviceBehaviors>
    </behaviors>

    App.Config

    <system.serviceModel>
      <diagnostics performanceCounters="Off">
        <messageLogging logEntireMessage="true" logMalformedMessages="false" logMessagesAtServiceLevel="false" logMessagesAtTransportLevel="false" />
      </diagnostics>
      <behaviors />
      <bindings>
        <wsHttpBinding>
          <binding name="WSHttpBinding_CogIdTool" closeTimeout="00:01:00" openTimeout="00:01:00" receiveTimeout="00:10:00" sendTimeout="00:01:00" bypassProxyOnLocal="false" transactionFlow="false" hostNameComparisonMode="StrongWildcard" maxBufferPoolSize="524288" maxReceivedMessageSize="65536" messageEncoding="Text" textEncoding="utf-8" useDefaultWebProxy="true" allowCookies="false">
            <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384" maxBytesPerRead="4096" maxNameTableCharCount="16384" />
            <reliableSession ordered="true" inactivityTimeout="00:10:00" enabled="false" />
            <security mode="None">
              <transport clientCredentialType="Windows" proxyCredentialType="None" realm="" />
              <message clientCredentialType="Windows" negotiateServiceCredential="true" establishSecurityContext="true" />
            </security>
          </binding>
          <binding name="wsHttpBindingNoMsg">
            <security mode="None">
              <transport clientCredentialType="Windows" />
              <message clientCredentialType="Windows" />
            </security>
          </binding>
        </wsHttpBinding>
      </bindings>
      <client>
        <endpoint address="http://sdet/CogId_WCF/IdToolService.svc" binding="wsHttpBinding" bindingConfiguration="wsHttpBindingNoMsg" contract="CogIdServiceReference.CogIdTool" name="IISHostWsHttpBinding">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
        <endpoint address="http://localhost:1890/IdToolService.svc" binding="wsHttpBinding" bindingConfiguration="WSHttpBinding_CogIdTool" contract="CogIdServiceReference.CogIdTool" name="WSHttpBinding_CogIdTool">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
        <endpoint address="http://elec/CoordinateIdTool/IdToolService.svc" binding="wsHttpBinding" bindingConfiguration="wsHttpBindingNoMsg" contract="CogIdServiceReference.CogIdTool" name="IIS7HostWsHttpBinding_Elec">
          <identity>
            <dns value="localhost" />
          </identity>
        </endpoint>
        <endpoint address="net.tcp://elec:9090/CoordinateIdTool_Tcp/IdToolService.svc" binding="netTcpBinding" bindingConfiguration="" contract="CogIdServiceReference.CogIdTool" name="IIS7HostTcpBinding_Elec">
          <identity>
            <dns value="localhost"/>
          </identity>
        </endpoint>
      </client>
    </system.serviceModel>

    Read the article
