Search Results

Search found 19134 results on 766 pages for 'support contract'.

Page 346/766

  • 0.00006103515625 GB of RAM. Is .NET MicroFramework part of Windows CE?

    - by Rocket Surgeon
    The .NET MicroFramework claims to work in 64K of RAM and has a list of compatible target vendors. At the same time, the same vendors who ship hardware and create Board Support Packages (vendors like Adeneo) keep releasing something named a Windows CE 7 BSP for the same hardware targets. Obviously an OS as heavy as WinCE needs more than 64K of RAM. So .NET MicroFramework is somehow relevant to WinCE, but how? Is it part of the bigger OS, is it the base of it, or are the two mutually exclusive? Background: 0.00006103515625 GB of RAM is the same as 64 KB of RAM. I am looking for the possibility of using Microsoft development tools for a small target like the BeagleBone. http://www.adeneo-embedded.com/About-Us/News/Release-of-TI-BeagleBone Nice. Now... where is a MicroFramework for the same BeagleBone? Is it inside the released pile?
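
    (The figure in the title checks out: 64 KB × 1,024 bytes/KB = 65,536 bytes, and 65,536 / 1,073,741,824 bytes per GB = 0.00006103515625 GB.)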

    Read the article

  • Scalable distributed file system for blobs like images and other documents

    - by Pinnacle
    Cassandra & HBase both do not efficiently support storage of blobs like images. Storing directly on HDFS stresses the Namenode because of huge number of files. Facebook uses Haystack for images and attachments storage, but this is not open source. So is Lustre a good choice for distributed blob storage? I have read that Amazon S3 is used by many, but this would cost money and personally, I would not like to rely on third party system. What are other suggestions?

    Read the article

  • Are your merchandise systems limiting growth? Oracle Retail's Merchandise Operations Management could be the answer

    - by user801960
    In this video, Lara Livgard, Director of Oracle Retail Strategy, introduces Oracle Retail Merchandise Operations Management (MOM), a set of integrated, modular solutions that support buying, pricing, inventory management and inventory valuation across a retailer’s channels, countries, and business models. MOM is the backbone of successful retail operations, providing timely and accurate visibility across the entire enterprise and enabling efficient supply-chain execution driven by plans and forecasts. Its modular architecture facilitates tailored, high-value implementations, giving retailers the information they need in order to offer a quality customer experience through a truly integrated multi-channel approach. Further information is available on the Oracle Retail website regarding Merchandise Operations Management.

    Read the article

  • How to convince my boss to improve code quality?

    - by Vimvq1987
    The place I'm working for is a service provider. We have a lot of services which were written against deadlines, so their code is really terrible: no coding convention, everyone codes in his own style; no unit testing (which is really bad); no refactoring (which is truly worse); no automated build/deployment; and so on. This code is reused again and again, so the bad code continues to spread all over my department. I really want to set a quality standard for our code by requiring everyone to follow rules: every line of code which does not follow the convention will be rejected, every function which does not pass unit testing will not be committed, and so on. But I don't know how to convince my boss to let me do this. I'm a relative newcomer, so inspiring people through my work alone is really hard, and I think it would be easier if my boss supported me in this. Thank you very much for your advice.

    Read the article

  • Oracle BI Publisher integration Guidelines published

    - by Anthony Shorten
    The Oracle Utilities Application Framework integrates with Oracle BI Publisher 10g/11g to allow reports to be submitted and viewed from within the Oracle Utilities Application Framework based product. The integration is achieved using base-provided algorithms that perform the integration from within the product. A new whitepaper has been released that outlines how to configure the algorithm for integration, along with some guidelines on how to define a new report to the product for submission and viewing. The whitepaper is available from My Oracle Support as KB Id 1299732.1, titled “BI Publisher Guidelines for Oracle Utilities Application Framework”.

    Read the article

  • SQL Peer-to-Peer Dynamic Structured Data Processing Collaboration

    Unstructured and XML semi-structured data is now used more than structured data. But fixed structured data still keeps businesses running day in and day out, and it requires consistent, predictable, highly principled processing to produce correct results. For this reason, it would be very useful to have a general-purpose SQL peer-to-peer collaboration capability that combines principled hierarchical data processing with flexible, advanced structured processing to support dynamically structured data and its dynamic structured processing. This flexible dynamic structured processing can change the structure of the data as necessary for the required processing while preserving relational and hierarchical data principles. It can operate freely across remote, unrelated peer locations at any time, and can transparently handle unpredictable and unknown structure and data-type changes automatically for immediate processing, using automatic metadata maintenance.

    Read the article

  • What technology should I concentrate on for mobile development? [closed]

    - by Rob2211
    Firstly, I have many years' experience with C# & .NET and some with Java. But rather than committing to Java and developing native applications for Android, I have been researching cross-platform deployment technologies. Currently, the most powerful cross-platform technology seems to be Flash, using Adobe AIR to package software as native applications. But given Adobe's announcement that it will discontinue support for the Flash Player on mobile devices, it seems foolish (at this late stage) to invest in Flash and ActionScript as a developer. There has been speculation that Microsoft is also planning its exit strategy for Silverlight in favour of HTML5. So, my questions are: What is the most appropriate technology to invest in and learn in order to build cross-platform mobile applications / games while future-proofing my skills as a developer? Is HTML5 mature enough to fill the 'Flash void' and be used to start building cross-platform, rich, interactive, networked mobile applications / games now? N.B. For HTML5 read (HTML5/CSS3/JavaScript).

    Read the article

  • What features would you like to have in PHP? [closed]

    - by StasM
    Since it's the holiday season now and everybody's making wishes, I wonder - which language features would you wish PHP had added? I am interested in some practical suggestions/wishes for the language. By practical I mean: Something that can be practically done (not: "I wish PHP would guess what my code means and fix bugs for me" or "I wish any code would execute under 5ms"); Something that doesn't require changing PHP into another language (not: "I wish they'd drop $ signs and use space instead of braces" or "I wish PHP were compiled, statically typed and had # in its name"); Something that would not require breaking all the existing code (not: "Let's rename 500 functions and change the parameter order for them"); Something that does change the language or some interesting aspect of it (not: "I wish there was an extension to support the XYZ protocol" or "I wish bug #12345 were finally fixed"); Something that is more than a rant (not: "I wish PHP wouldn't suck so badly"). Anybody have any good wishes? Mod edit: Stanislav Malyshev is a core PHP developer.

    Read the article

  • Moving from XNA/C# to DirectX/C++ quite confused

    - by misiMe
    I made some games with XNA/C# for Windows Phone and Windows 8. Since XNA is dead and Visual Studio doesn't support it (I have to target Windows Phone 7.1 to build with XNA), I want to start learning something more "consistent in time" and improve my skills. I'm a little confused about the possibilities, because C++/DirectX alone seems difficult, so I found some high-level libraries to help: DirectX Toolkit and Cocos2D. My questions are: What will happen when they "die" like XNA did? Is the C++ approach more "professional" than C#/XNA, and why? Is the C++ approach more "portable"? Is the C++ approach more durable over time? Is there any consideration regarding DirectX TK and Cocos2D in terms of performance? I ask this because I found that every game software house in my country looks for skilled C++ programmers.

    Read the article

  • Access Control Service v2

    - by Your DisplayName here!
    A Resource-STS (others call it an RP-STS or federation gateway) is a necessity for non-trivial federated identity scenarios. ADFS v2 does an excellent job of fulfilling that role – but (as of now) you have to run ADFS on-premise. The Azure Access Control Service is a Resource-STS in the cloud (with all the usual scalability/availability promises). Unfortunately a lot of (the more interesting) features in ACS v1 had to be cut due to constrained time/resources. The good news is that ACS v2 is now in CTP and brings back a lot of the missing features (like WS-* support) and adds some really sweet new ones (out-of-the-box federation with Google, Facebook, LiveID – and OpenID in general). You can read about the details here. On a related note – ACS v2 works out of the box with StarterSTS – simply choose the ADFS v2 option and point the management portal to the StarterSTS WS-Federation metadata endpoint. Have fun ;)

    Read the article

  • copy & paste in VirtualBox remote console when running headless

    - by katsumii
    One can run a VirtualBox guest in "headless" mode and access it using a Remote Desktop Protocol (RDP) client. This is typical when the VBox server is installed on Linux/Solaris, where the X-window stuff is not installed, and users use Windows to access the VM. So one can install an OS into VBox guests using a Remote Desktop client (e.g. mstsc.exe). Here's the setting. One lesser known feature here is that you can copy & paste into and out of the VM guest from your client. Apparently, "VMware Workstation" still doesn't support this; from the VMware Workstation Documentation Center: "You cannot copy and paste text between the host system and the guest operating system."
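
    For reference, a minimal sketch of the commands involved (the VM name "myguest" is just a placeholder, and exact option names can vary between VirtualBox versions):

        # enable the VirtualBox remote display (VRDE/RDP) server for the guest
        VBoxManage modifyvm "myguest" --vrde on --vrdeport 3389

        # start the guest with no local GUI; connect from a client with mstsc.exe or rdesktop
        VBoxHeadless --startvm "myguest"

    The copy & paste described above then travels over the RDP session between the client and the guest.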

    Read the article

  • Adaptive Layout for ADF Faces on Tablets

    - by Shay Shmeltzer
    In the 11.1.1.6 version of Oracle ADF we started adding specific features to the ADF Faces components so they'll work better on iPad tablets. In this entry I'm going to highlight some new capabilities that we have added to the 11.1.2.3 release. (Note: if you are still on the 11.1.1.* branch you'll need to wait for 11.1.1.7 to get the features discussed here.) The two key additions in the 11.1.2.3 version compared to the 11.1.1.6 features for iPad support are pagination for tables and adaptive flow layout. Pagination for tables is fairly self-explanatory: since the iPad doesn't support scroll bars, we automatically switch the table component to render with a pagination toolbar that allows you to scroll through sets of records or jump directly to a specific set. See the image below. The adaptive flow layout takes a bit more explanation. On regular desktops the UI that you usually build for ADF Faces screens uses a stretch layout - meaning that it stretches to fill the whole area of the browser window. If you resize the browser window, the ADF Faces page resizes with it. If your browser window is too small, scroll bars appear to let you scroll to areas that are "hidden". However, on an iPad this is probably not the type of layout you want - you would rather have a flow layout that eliminates scroll bars and instead allows you to scroll down the page. Basically you want the page to be sized based on its content rather than on the browser window size. In ADF Faces terminology this can be done by setting the dimensionsFrom property to "children". And here comes the tricky part: in the past (and also today), when you create an ADF Faces page and add a stretchable component to it, the dimensionsFrom property is set to "parent" by default. The same is true for other layout components you'll add. At this point you might be wondering, "Does this mean I'll need to go to each of the layout components in my page and modify the dimensionsFrom property value to be children?" ADF Faces to the rescue... To eliminate the need for these tedious manual changes, we introduced a new web.xml parameter, "oracle.adf.view.rich.geometry.DEFAULT_DIMENSIONS". You basically add the following to your web.xml:

      <context-param>
        <description>
          This parameter controls the default value for component geometry on the page.
          Supported values are:
            legacy - component attributes use the default values as specified for the attributes
                     in the tag documentation (default value)
            auto   - component attributes use the correct default value given the value of their
                     parent component. For example, with this setting, the panelStretchLayout
                     will use "auto" as the default value for its "dimensionsFrom" attribute
                     instead of "parent".
        </description>
        <param-name>oracle.adf.view.rich.geometry.DEFAULT_DIMENSIONS</param-name>
        <param-value>auto</param-value>
      </context-param>

    Once you set this parameter, you only need to set the dimensionsFrom attribute on the top-level layout component of your page, and the rest of the components will adjust accordingly. One trick that you can use, and that is used in the demo below, is to have the dimensionsFrom property depend on the type of client that accesses your application. This way you can switch between a stretch or flow layout based on the device accessing your application.
    For example, I use the following in my page:

      <af:panelStretchLayout topHeight="70px" startWidth="0px" endWidth="0px"
                             dimensionsFrom="#{adfFacesContext.agent.capabilities['touchScreen'] eq 'none' ? 'parent' : 'children'}">

    This results in a flow layout for iPads and a stretch layout for regular browsers. Check out the result in the demo below.

    Read the article

  • Upcoming Customer WebCast: Adapters and JCA Transport in Oracle Service Bus 11g

    - by MariaSalzberger
    There is an upcoming webcast planned for September 19th that will show how to implement services using a JCA adapter in Oracle Service Bus 11g. The session will help you utilize existing resources like samples and information centers for adapters in the context of Oracle Service Bus. Topics covered in the webcast are: a JCA Transport overview and inbound/outbound scenarios using JCA adapters; implementation of an end-to-end use case using an inbound file adapter and an outbound database adapter in Oracle Service Bus; how to find information on supported adapters in a certain version of OSB 11g; available adapter samples for OSB and SOA; how to use SOA adapter samples for Oracle Service Bus; a live demo of an adapter sample implementation in Oracle Service Bus; and information centers for adapters and Oracle Service Bus. The presentation recording can be found here after the webcast. Select "Oracle Fusion Middleware" as the product. (https://support.oracle.com/rs?type=doc&id=740966.1) The schedule for future webcasts can be found in the above mentioned document as well.

    Read the article

  • Dynamic tests with mstest and T4

    - by Victor Hurdugaci
    If you have used mstest and NUnit you might be aware of the fact that the former doesn't support dynamic, data-driven test cases. For example, the following scenario cannot be achieved with out-of-the-box mstest: given a dataset, create a distinct test case for each entry in it, using a predefined generic test case. The best result that can be achieved using mstest is a single test case that iterates through the dataset. There is one disadvantage: if the test fails for one entry in the dataset, the whole test case fails. So, in order to overcome the previously mentioned limitation, I decided to create a text template that will generate the test cases for me. As an example, I will write some tests for an integer multiplication function that has 2 bugs in it: Read more >> [Cross post from victorhurdugaci.com]
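
    To make the limitation concrete, here is a minimal sketch (the Calculator.Multiply function and the dataset are hypothetical stand-ins) of the single iterating test case that out-of-the-box mstest pushes you toward - the first failing entry aborts the loop and the whole run is reported as one failure:

        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class MultiplicationTests
        {
            // The dataset we would rather expand into one generated test case per entry.
            private static readonly int[][] Cases =
            {
                new[] { 2, 3, 6 },
                new[] { 0, 5, 0 },
                new[] { -4, 2, -8 },
            };

            [TestMethod]
            public void Multiply_AllCases() // one test case covering every entry
            {
                foreach (var c in Cases)
                {
                    // Calculator.Multiply is the (hypothetical) function under test.
                    // If one entry fails, the remaining entries never run.
                    Assert.AreEqual(c[2], Calculator.Multiply(c[0], c[1]));
                }
            }
        }

    A T4 text template can instead emit one [TestMethod] per dataset entry, so each entry passes or fails independently in the test results.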

    Read the article

  • Oracle OpenWorld 2013 Summary

    - by JuergenKress
    Did you miss Oracle OpenWorld 2013? Here is the key information from Thomas Kurian’s middleware presentation, our partners' presentations and the first impressions of SOA Suite 12c. Thanks to all partners for the excellent presentations and to the product management team for the superb demo ground! Oracle OpenWorld General Session 2013: Middleware Watch Full-Length Keynote JavaOne keynote At our WebLogic Community Workspace (WebLogic Community membership required) you can download the presentation slides from Thomas Kurian’s presentation FMWGeneralSessionTKv22.pptx. Download all session slides here. WebLogic Partner Community For regular information become a member of the WebLogic Partner Community; please visit: http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: OOW,Oracle OpenWorld,WebLogic 12c training,education,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • The Internet from a 1990s Point of View [Video]

    - by Asian Angel
    Are you ready for a retro look at the Internet? Then prepare to journey back in time to 1995 with this video and its view of the early days of the Internet. From YouTube: Trine Gallegos hosts this segment shot in 1995 when the Internet was first becoming an icon. This is an interesting look back at how clunky the applications were. I don’t even think they were using a computer mouse yet. Internet – from the 1990′s point of view [via Fail Desk]

    Read the article

  • Catalyst 12.1 for AGP card

    - by Brian
    I have a Radeon HD 4650 AGP card. The installation instructions at http://wiki.cchtml.com/index.php/Ubuntu_Oneiric_Installation_Guide download Catalyst 12.1. However, there is no download available at http://support.amd.com/us/gpudownload/Pages/index.aspx for my AGP card; only Windows drivers are available. Should I go ahead and use the guide and manually install 12.1? I assume it's not on the AMD website for a reason, but I was wondering if anybody had any experience with this. Thanks!

    Read the article

  • SO-Aware Service Explorer – Configure and Export your services from VS 2010 into the repository

    - by cibrax
    We have introduced a new Visual Studio tool called "Service Explorer" as part of the new SO-Aware SDK version 1.3 to help developers configure and export any regular WCF service into the SO-Aware service repository. This new tool is a regular Visual Studio tool window that can be opened from "View -> Other Windows -> Services Explorer". Once you open the Services Explorer, you will be able to see all the available WCF services in the Visual Studio solution. In the image above, you can see that a "HelloWorld" service was found in the solution and listed in the tool window on the left. There are two things you can do for a new service in the tool: you can either export it to the SO-Aware repository or associate it with an existing service version in the repository. Exporting the service to SO-Aware means that you want to create a new service version in the repository and associate the WCF service WSDL with that version. Associating the service means that you want to use a version already created in SO-Aware with the only purpose of managing and centralizing the service configuration in SO-Aware. The option for exporting a service will pop up a dialog like the one below in which you can enter some basic information about the service version you want to create and the repository location. The option for associating a service will pop up a dialog in which you can pick any existing service version in the repository and the application configuration file that you want to keep in sync for the service configuration. Two options are available for configuring a service: WCF Configuration or SO-Aware. The WCF Configuration option just tells the tool that the service will use the standard WCF configuration section "system.serviceModel", but that section must be updated and kept in sync with the configuration selected for the service in the repository. The SO-Aware configuration option tells the tool that the service configuration will be resolved at runtime from the repository. For example, selecting SO-Aware will generate the following configuration in the selected application configuration file:

      <configuration>
        <configSections>
          <section name="serviceRepository" type="Tellago.ServiceModel.Governance.ServiceConfiguration.ServiceRepositoryConfigurationSection, Tellago.ServiceModel.Governance.ServiceConfiguration" />
        </configSections>
        <serviceRepository url="http://localhost/soaware/servicerepository.svc">
          <services>
            <service name="ref:HelloWorldService(1.0)@dev" type="SOAwareSampleService.HelloWorldService" />
          </services>
        </serviceRepository>
      </configuration>

    As you can see, the tool is a great addition to the toolset that any developer can use to manage and centralize configuration for WCF services. In addition, it can be combined with other useful tools like WSCF.Blue (Web Service Contract First) for generating the service artifacts like schemas, service code or the service WSDL itself.
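
    For context, here is a minimal sketch of the kind of WCF service the tool would discover and export - the contract and implementation below are assumptions modeled on the SOAwareSampleService.HelloWorldService type referenced in the generated configuration, not code shipped with the SDK:

        using System.ServiceModel;

        namespace SOAwareSampleService
        {
            [ServiceContract]
            public interface IHelloWorldService
            {
                [OperationContract]
                string SayHello(string name);
            }

            // Concrete service type; its WSDL is what gets associated with the
            // service version created in the SO-Aware repository.
            public class HelloWorldService : IHelloWorldService
            {
                public string SayHello(string name)
                {
                    return string.Format("Hello, {0}!", name);
                }
            }
        }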

    Read the article

  • How can I upgrade to Lubuntu 14.04.1 without a PAE kernel?

    - by Richard
    On my Dell Latitude D800 laptop, which has an old Pentium M without PAE support, I was able to successfully install Lubuntu 14.04 from the CD. When I try to upgrade, I get the error: "PAE not enabled. Your system uses a CPU that does not have PAE enabled. Ubuntu only supports non-PAE systems up to Ubuntu 12.04. To upgrade to a later version of Ubuntu, you must enable PAE (if this is possible); see: http://help.ubuntu.com/community/EnablingPAE" I have seen the Community Wiki page for PAE. The suggestion is to add forcepae to the command line options. When I do this, I get the same message. Moreover, dmesg does not indicate PAE was ever enabled. Is there anything else I can try to get Lubuntu to upgrade correctly?

    Read the article

  • Virtual Developer Day: Oracle WebLogic Server & Java EE (#OTNVDD)

    - by Justin Kestelyn
    Virtual Developer Day is back with a vengeance! On Feb. 1, log in to learn how Oracle WebLogic Server enables a whole new level of productivity for enterprise developers. Also hear the latest on Java EE 6 and the programming tenets that have made it a true platform breakthrough, and get hands-on with our VirtualBox virtual machine image! Even better, you never have to leave your desk - you'll get access to live sessions with chat support, and even 1-1 desktop sharing upon request. It's a no-brainer - get registered!

    Read the article

  • Internal and external API architecture

    - by Tacomanator
    The company I work for maintains a successful SaaS product that grew "organically" over the years. We are planning to expand the line with a suite of new products that will share data with the existing product. To support this, we are looking to consolidate business logic into a single place: a web service layer. The WS layer will be used by: the web applications, a tool to import data, and a tool to integrate with other client software (not an API per se). We also want to create an API that can be used by those of our customers capable of using it to create their own integrations. We are struggling with the following question: should the internal API (aka the WS layer) and the external API be one and the same, with security and permission settings to control what can be done by whom, or should they be two separate applications where the external API just calls the internal API like any other application? So far in our debate it seems that separating them may be more secure, but will add overhead. What have others done in a similar situation?

    Read the article

  • New Recipes for WebLogic 12.1.2

    - by JuergenKress
    New Recipes for Oracle WebLogic 12.1.2 Oracle ACE Director Frank Munz talks about the new content to be found in the newly published second edition of his book "Oracle WebLogic Server 12c: Distinctive Recipes: Architecture, Development and Administration," and about some of his favorite features in WebLogic 12.1.2. Watch the video here. WebLogic Partner Community For regular information become a member in the WebLogic Partner Community please visit: http://www.oracle.com/partners/goto/wls-emea ( OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: Frank Munz,WebLogic Recipes,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • Timeout when trying to install ubuntu 12.04

    - by Oskars
    When I click on "Try Ubuntu" after booting it from a USB, the screen goes black and a lot of text comes out. But after a while it says "timeout: killing /blablabalba" or something like that, and it just keeps on saying that. What have I done wrong? <.< My computer is an Acer ASPIRE M5201, if that is of any importance. Edit: I noticed now that before it says "timeout" it says something about not supporting "DNS" or "Asn" or something like that. I'm not sure if it was exactly dns and asn, but does this mean that I can't run Ubuntu on this computer? Edit 2: The exact strings are: "[sdg] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA" "timeout: killing '/sbin/modprobe -bv pci:v00001002d00" (and more numbers which I didn't catch)

    Read the article

  • Upgrading to Code Based Migrations EF 4.3.1 with Connector/Net 6.6

    - by GABMARTINEZ
    Entity Framework 4.3.1 includes a new feature called code-first migrations. We are adding support for this feature in our upcoming 6.6 release of Connector/Net. In this walk-through we'll see the workflow of code-based migrations when you have an existing application and you would like to upgrade to this EF 4.3.1 version and use this approach, so you can keep track of the changes that you make to your database. The first thing we need to do is add the new Entity Framework 4.3.1 package to our application. This should be done via the NuGet package manager. You can read more about why EF is not part of the .NET framework here.
    Adding EF 4.3.1 to our existing application: Inside VS 2010 go to Tools -> Library Package Manager -> Package Manager Console; this will open the PowerShell host window where we can work with all the EF commands. In order to install this library into your existing application you should type: Install-Package EntityFramework. This will make some changes to your application, so let's check them. In your .config file you'll see a <configSections> element which contains the EntityFramework version you have, and an <entityFramework> section was also added, as shown below. This section is by default configured to use SQL Express, which won't be necessary for this case, so you can comment it out or leave it empty. Also please make sure you're using the Connector/Net 6.6.x version, which is the one that has this support, as shown in the previous image. At this point we face one issue: in order to be able to work with migrations we need the __MigrationHistory table, which we don't have yet since our database was created with an older version. This table is used to keep track of the changes in our model, so we need to get it into our existing database.
    Getting a migration-history table into an existing database: The first thing we need to do to enable migrations in our existing application is to create our configuration class, which will set up the MySqlClient provider as our SQL generator. We add it with the following code:

      using System.Data.Entity.Migrations;  // add this at the top of your .cs file

      // Make sure to use the name of your existing DbContext as the type parameter.
      public class Configuration : DbMigrationsConfiguration<NameOfYourDbContext>
      {
          public Configuration()
          {
              // Set automatic migrations to false since we'll be applying the migrations manually in this case.
              this.AutomaticMigrationsEnabled = false;
              SetSqlGenerator("MySql.Data.MySqlClient", new MySql.Data.Entity.MySqlMigrationSqlGenerator());
          }
      }

    This code sets up the configuration that we'll be using when executing all the migrations for our application. Once we have done this we can build our application to check that everything is fine.
    Creating our initial migration: Now let's add our initial migration. In Package Manager Console, execute "add-migration InitialCreate"; you can use any other name, but I like to set this as our initial create for future reference. After we run this command, some changes are made in our application: a new Migrations folder is created, along with a new migration class called InitialCreate, which in most cases should have empty Up and Down methods as long as your database is up to date with your model. Since all your entities already exist, delete any duplicated code that would create an entity which already exists in your database. I find this easier when you don't have any pending updates to make to your database.
    Now we have our empty migration that will make no changes in our database and represents how all the things are at the beginning of our migrations. Finally, let's create our __MigrationHistory table. Optionally you can add SQL code to delete the EdmMetadata table, which is not needed anymore:

      public override void Up()
      {
          // Just make sure that you used EF 4.1 or a later version.
          Sql("DROP TABLE EdmMetadata");
      }

    From our Package Manager Console let's type: Update-Database. If you would like to see the operations made by each Update-Database command, you can add the -Verbose flag after Update-Database. This will make two important changes. First, it will execute the Up method in the initial migration, which makes no changes in the database. Second, and very important, it will create the __MigrationHistory table necessary to keep track of your changes. The next time you make a change to your database it will compare the current model to the one stored in the Model column of this table.
    Conclusion: The important thing in this walk-through is that we must create our initial migration before we start making any changes to our model. This way we'll be adding the necessary __MigrationHistory table to our existing database, so we can keep our database up to date with all the changes we make in our context model using migrations. Hope you have found this information useful. Please let us know if you have any questions or comments; also please check our forums here, where we keep answering questions for the community in general. Happy MySQL/Net coding!
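
    As a footnote to the walkthrough above: once the history table is in place, later model changes are captured as ordinary code-based migrations. As a purely illustrative example (the entity and column names here are assumptions, not part of the walkthrough), running "add-migration AddCustomerEmail" after adding a new property to a Customer entity would scaffold something like:

        using System.Data.Entity.Migrations;

        public partial class AddCustomerEmail : DbMigration
        {
            public override void Up()
            {
                // Adds the new column to the existing table.
                AddColumn("Customers", "Email", c => c.String(maxLength: 255));
            }

            public override void Down()
            {
                // Reverses the change if the migration is rolled back.
                DropColumn("Customers", "Email");
            }
        }

    Running Update-Database then applies the Up method against MySQL through the MySqlMigrationSqlGenerator configured above and records the new model in __MigrationHistory.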

    Read the article

  • 12.04 BCM4312 and aireplay-ng/airodump-ng

    - by Haxornator
    First off, I've read through all of the posts regarding the BCM4312 on the forums but haven't been able to get any help. Basically I have a Dell Inspiron 1564 on which I've installed 12.04, and for the most part everything works fine, but now that I'm trying to use more in-depth utilities such as aireplay and airodump I'm coming across what I believe to be a driver problem that's not providing compatibility for these programs. Does anyone out there have any suggestions on how to resolve this? This is the error I receive: root@Haxornator:~/aircrack/aircrack-ng-1.1# airodump-ng eth2 ioctl(SIOCSIWMODE) failed: Invalid argument ARP linktype is set to 1 (Ethernet) - expected ARPHRD_IEEE80211, ARPHRD_IEEE80211_FULL or ARPHRD_IEEE80211_PRISM instead. Make sure RFMON is enabled: run 'airmon-ng start eth2 <#' Sysfs injection support was not found either.

    Read the article
