Search Results

Search found 23233 results on 930 pages for 'feature request'.


  • osx 10.6 dhcp client-id option

    - by Clustermagnet
    Trying to join an OS X machine to a DHCP network that forces certain client IDs. Even if I modify the client ID via the network properties, the DHCP server does not accept the request, since Windows and Apple send it via different options. You can set the class ID in the Network Control Panel in the DHCP Client ID box; the trick lies in setting up your DHCP server to recognize it. Windows XP sends the DHCP class ID via DHCP option 77, and OS X sends it via option 61. You'll have to set your DHCP server to check for option 61, with an offset of 1 and a length of 9. That's the tricky bit. So, without modifying the DHCP server... (which does not belong to me), is there anything that can be done on OS X to change the client-id option to 77?
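
    For reference, the only knob exposed on the OS X side sets the value of the client ID, not the option number it is sent under; a minimal sketch using the built-in networksetup tool, with "Ethernet" as a placeholder service name:

      # List the actual network service names first
      networksetup -listallnetworkservices
      # Set the DHCP client ID for that service (OS X will still transmit it as option 61)
      networksetup -setdhcp "Ethernet" "my-client-id"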

    Read the article

  • Is it bad practice for a module to contain more information than it needs?

    - by gekod
    I just wanted to ask for your opinion on a situation that occurs sometimes and for which I don't know the most elegant solution. Here it goes: module A reads an entry from a database and sends a request to module B containing ONLY the information from the entry that module B needs to accomplish its job (to keep modularity, I give it just the information it needs; module B has nothing to do with the rest of the information from the DB entry). After finishing its job, module B has to reply to module C with whether it succeeded or failed. To do this, module B replies with the information it got from module A plus a variable meaning success or fail. Now here comes the problem: module C needs to find that entry again, BUT the information it got from module B is not enough to uniquely identify the exact same entry. I don't think that having module A pass module B extra information it doesn't need for its own job, just so it can hand it back to module C, would be good practice, because that would mean giving a module information it doesn't really need. What do you think?
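
    One common compromise is an opaque correlation key: module A includes a single key that B never interprets and simply echoes back, so C can re-read the entry without B knowing anything extra. A minimal PHP sketch with hypothetical function and field names:

      <?php
      // Hypothetical handler for module B: it works only on 'payload' and
      // echoes 'entry_key' back untouched so module C can find the entry again.
      function handle_request(array $job): array {
          $ok = do_work($job['payload']);
          return [
              'entry_key' => $job['entry_key'],   // opaque to B, meaningful to A and C
              'status'    => $ok ? 'success' : 'fail',
          ];
      }

      function do_work(string $payload): bool {
          return $payload !== '';                 // placeholder for B's real logic
      }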

    Read the article

  • Can I replace a router and DHCP server without disrupting traffic?

    - by SRobertJames
    I have a device which acts as a router and DHCP server. I'm replacing it and would like to minimize downtime. If I unplug it and plug in a different device with the same IP, will all the PCs with DHCP leases keep working? (I have DHCP conflict detection on, so it shouldn't reassign a DHCP address that is already in use.) What if I want to change the IP (new subnet): is there any way to tell all the clients (Windows PCs) to release their DHCP leases and request new ones in a minute? If, before unplugging the old device, I have it release all DHCP leases, will the Windows PCs automatically ask for new DHCP addresses?
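
    For the "release and request new leases" part, Windows clients can be told to do this from the command line; a minimal sketch, assuming it can be run (or pushed out via a script) on each PC:

      rem Drop the current lease, then request a new one from whichever DHCP server answers
      ipconfig /release
      ipconfig /renew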

    Read the article

  • Logging full network connection like in browser

    - by pokemarine
    Can I set my browser, or download a special application, to log/download everything that goes through the network while browsing a site? When I open a website in Firefox, it downloads the files to temporary folders, but I need everything, and by everything I mean full Ajax request logging, image downloads, etc.: every file that is downloaded while browsing sites (and not temporarily; I need them later, so it should log and categorize them somehow). Put another way: I want to automatically save everything you can see in the browser's network tab (after pressing F12), not just the responses; I also need the information about what was sent and what came back.
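
    One option outside the browser is to record the whole session with a capturing proxy; a minimal sketch, assuming mitmproxy is installed and the browser is configured to use it as an HTTP/HTTPS proxy:

      # Record every request and response (headers and bodies) passing through the proxy
      mitmdump -p 8080 -w session.flows
      # Inspect or export the saved traffic later
      mitmproxy -r session.flows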

    Read the article

  • wildcard host name bindings for multiple subdomains in multiple sites on IIS7 with a single IP address

    - by orca
    Situation: I have a single Windows 2008 server with a single public IP address. I have multiple domains with wildcard A records pointing to that single IP. I need each domain to be hosted by a different web site (i.e. www.domain1.com by site domain1site). I need domain1.com to act like www.domain1.com. I need each site to be able to have multiple subdomains (i.e. www.domain1.com, abc.domain1.com, xyz.domain1.com). Not strictly relevant yet, but here it goes: I plan to handle each subdomain with a different application hosted in the same site (i.e. application /xyz in domain1site). However, I found out that IIS7 does not support creating web sites with a wildcard host name binding, and setting the binding without any subdomain (i.e. domain1.com) does not work, even for www.domain1.com. Is there a simple solution? Does any IIS extension like Application Request Routing provide such a capability?
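
    In the absence of true wildcard host headers, one workaround is to add an explicit binding for each known host name; a sketch with appcmd, assuming the site name and host names from the question:

      %windir%\system32\inetsrv\appcmd set site /site.name:"domain1site" /+bindings.[protocol='http',bindingInformation='*:80:domain1.com']
      %windir%\system32\inetsrv\appcmd set site /site.name:"domain1site" /+bindings.[protocol='http',bindingInformation='*:80:www.domain1.com']
      %windir%\system32\inetsrv\appcmd set site /site.name:"domain1site" /+bindings.[protocol='http',bindingInformation='*:80:abc.domain1.com']

    A binding with an empty host name makes a site answer any host on that IP and port, but only one site per IP:port combination can be set up that way.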

    Read the article

  • LDAP ACI Debugging

    - by user13332755
    If you've ever wondered which ACI in LDAP is used for a particular ADD/DELETE/MODIFY/SEARCH request, you need to enable ACI debugging to get the details. Edit/modify dse.ldif:

      nsslapd-infolog-area: 128
      nsslapd-infolog-level: 1

    ACI logging will be placed in the 'errors' file and looks like this:

      [22/Jun/2011:15:25:08 +0200] - INFORMATION - NSACLPlugin - conn=-1 op=-1 msgId=-1 -  Num of ALLOW Handles:15, DENY handles:0
      [22/Jun/2011:15:25:08 +0200] - INFORMATION - NSACLPlugin - conn=-1 op=-1 msgId=-1 -  Processed attr:nswmExtendedUserPrefs for entry:uid=mparis,ou=people,o=vmdomain.tld,o=isp
      [22/Jun/2011:15:25:08 +0200] - INFORMATION - NSACLPlugin - conn=-1 op=-1 msgId=-1 -  Evaluating ALLOW aci index:33
      [22/Jun/2011:15:25:08 +0200] - INFORMATION - NSACLPlugin - conn=-1 op=-1 msgId=-1 -  ALLOW:Found READ ALLOW in cache
      [22/Jun/2011:15:25:08 +0200] - INFORMATION - NSACLPlugin - conn=-1 op=-1 msgId=-1 -  acl_summary(main): access_allowed(read) on entry/attr(uid=mparis,ou=people,o=vmdomain.tld,o=isp, nswmExtendedUserPrefs) to (uid=msg-admin-redzone.vmdomain.tld-20100927093314,ou=people,o=vmdomain.tld,o=isp) (not proxied) (reason: result cached allow , deciding_aci  "DA anonymous access rights", index 33)
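
    The same two attributes can usually be changed online rather than by editing dse.ldif directly; a sketch with ldapmodify, assuming the server allows modifications to cn=config and using placeholder connection details:

      ldapmodify -h localhost -p 389 -D "cn=Directory Manager" -W <<EOF
      dn: cn=config
      changetype: modify
      replace: nsslapd-infolog-area
      nsslapd-infolog-area: 128
      -
      replace: nsslapd-infolog-level
      nsslapd-infolog-level: 1
      EOF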

    Read the article

  • FTP Logs in IIS 7.5

    - by Jacob84
    I know this is weird, but the truth is that I can't find the FTP logs on one IIS 7.5 server. In the IIS Management Console I've gone to the server and clicked on FTP Logging, which appears inside the FTP group (with other options like FTP Messages and FTP Request Filtering). The configured folder for logs seems to be C:\InetPubFolder\logs\LogFiles. If I go there, I find a lot of folders with the structure W3SVC#, where # is a number. They all contain logs, but they are HTTP logs, full of GET and POST verbs. Am I missing something? The server contains a lot of domains and it's hard to find.
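
    In IIS 7.5 the FTP service writes its own per-site folders named FTPSVC# (# being the FTP site ID), alongside the W3SVC# folders used by the web sites; a quick check, assuming the log root from the question:

      dir C:\InetPubFolder\logs\LogFiles\FTPSVC*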

    Read the article

  • Best Method/Library For Remote Authentication

    - by Mike
    I have a web app with a REST API interface (http://api.example.com/core) that uses API keys and domain-specific keys (a key has to be used on the specified domain). I will then have several client sites with Ajax forms where we will require users to sign in before being able to submit the form. The form will add data to a table and submit an email to several recipients, along with checking credentials, and it will use an Ajax submit to our REST API. All communication to/from the API is over SSL. Ideal flow: visitor fills out form -> enters user/pass -> submits form -> Ajax request to REST API -> API verifies credentials -> does CRUD -> sends emails -> returns 200/403 -> perform DOM manipulation based on the return code in the Ajax call. Are there any libraries in PHP that currently do something similar to this? Would OAuth be a good fit for this scenario? Languages used: js/html/css/php/MySQL.
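
    A minimal sketch of the server-to-server leg in PHP with curl, using a hypothetical /auth endpoint, header name, and credentials (an OAuth token would slot into the same place as the API key header):

      <?php
      // Hypothetical endpoint, header name, and credentials; all traffic assumed to be HTTPS.
      $apiKey   = 'client-site-key';
      $username = 'visitor';
      $password = 'secret';

      $ch = curl_init('https://api.example.com/core/auth');
      curl_setopt_array($ch, [
          CURLOPT_POST           => true,
          CURLOPT_RETURNTRANSFER => true,
          CURLOPT_HTTPHEADER     => ['X-Api-Key: ' . $apiKey],
          CURLOPT_POSTFIELDS     => http_build_query([
              'username' => $username,
              'password' => $password,
          ]),
      ]);
      $body   = curl_exec($ch);
      $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);   // 200 or 403, matching the flow above
      curl_close($ch);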

    Read the article

  • Getting file system free space

    - by Fred Riley
    This isn't a problem as such, more a request for information based on ignorance of the Linux filesystem. The very short question is: how do I find out how much free and used space there is on the volume from which Ubuntu is running? More detail: I'm running Ubuntu 12.04 from a 64GB USB3 stick, created by booting up a year-old Ubuntu 12.04 DVD and running Startup Disk Creator. The reason for this is that the Master Boot Record on my hard disk, holding Windoze 7, has gone belly-up, and whilst awaiting a recovery disk I'm running Ubuntu off USB or DVD as a 'trial'. (And I will continue to run Ubuntu after restoring Windoze, as I've rediscovered my love of the penguin :o)) After installing Ubuntu on the stick I ran the software update app, which downloaded some 450MB of updates and took a couple of hours to install to the stick. A couple of times I got a message saying that disk space was short. So I looked in the file manager (or whatever it's called these days) and couldn't see the stick listed, just: the SYSTEM hard disk (listed as a 479GB filesystem); two other partitions that had been created by Windoze; and a "4.3GB Filesystem" which, when I try to open it, gives the error "Could not find /cow", and when I try to unmount it, tells me I can't because it's not mounted - D'OH!! Edit: screenshot of file manager. Edit: screenshot of low disk space warning. What I can't see is the USB stick from which I'm running Ubuntu. Where's it gone, anybody know? This is tangentially related to a previous question of mine about system tools, in that I'm trying to get control and knowledge of the system in the newest incarnation of Ubuntu.
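
    For the short question, the standard command-line answer is df; a minimal example run from any terminal:

      df -h /    # free and used space on the volume the running system is mounted from
      df -h      # the same for every mounted filesystem, in human-readable units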

    Read the article

  • Apache LocationMatch throws 500 and AddOutputFilterByType does nothing

    - by tackleberry
    I need to add the directives below to Apache, but I get a 500 when I add these lines:

      <LocationMatch "^/assets/.*$">
        Header unset ETag
        FileETag None
        # RFC says only cache for 1 year
        ExpiresActive On
        ExpiresDefault "access plus 1 year"
      </LocationMatch>

    Additionally, the response is not gzipped when I add:

      AddOutputFilterByType DEFLATE text/html text/css application/javascript application/x-javascript

    Apache version: Server version: Apache/2.2.22 (Unix). App: Rails 3.2. When I checked the request and response for the gzip problem, I saw that the browser requested gzip (Accept-Encoding: gzip, deflate) but the response was not gzipped.
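
    A 500 on those directives is most often a sign that the modules they belong to are not loaded (Header needs mod_headers, Expires* needs mod_expires, DEFLATE needs mod_deflate); a sketch of what to check, assuming a Debian/Ubuntu-style layout, since paths and helper commands differ between builds:

      # See which of the three modules are actually loaded
      apachectl -M | grep -E 'headers|expires|deflate'
      # On Debian/Ubuntu builds, enable them and reload
      a2enmod headers expires deflate
      apachectl configtest && apachectl graceful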

    Read the article

  • Announcing Two Papers Addressing the RPAS Fusion Client

    - by Oracle Retail Documentation Team
    Oracle Retail has published two documents to My Oracle Support addressing the Retail Predictive Application Server (RPAS) Fusion Client, a web-based rich client developed using the latest Oracle Application Development Framework (ADF). The Fusion Client provides an enhanced user experience for communicating with the RPAS server. Oracle Retail Predictive Application Server Fusion Client Getting Started Guide (Doc ID 1492759.1): The Retail Predictive Application Server (RPAS) is a configurable platform that provides capabilities such as a multidimensional database structure, batch and online processing, a configurable user interface, a configurable calculation engine, user security, and utility functions such as importing and exporting, all on a highly scalable technical environment that can be deployed on a variety of hardware. This paper addresses typical questions that arise while setting up and deploying the Fusion Client, provides performance recommendations, and highlights the differences between the Classic Client and the Fusion Client. Oracle Retail RPAS Fusion Client Performance Issue Report (Doc ID 1493747.1): Performance issues can be frustrating for customers, and Oracle Retail will strive to assist you as you attempt to enhance the performance of your systems. To ensure the timeliest processing of your issue, retailers and partners are encouraged to respond as thoroughly as possible to each question in this document, which can be sent back for analysis by logging a Service Request and following typical Customer Support processes. The sections of the document solicit information about the following: Performance Issue Description, Performance Issue Details, System Configuration Data, Application Configuration Data, and Performance Log Files.

    Read the article

  • Network Block Device (NBD) clients for Windows or similar solutions

    - by przemoc
    Are there any NBD clients for Windows? Strangely, I cannot find any, or I am searching for them in the wrong way. Such a client would presumably be a driver with a front-end tool (perhaps a command-line one) allowing you to create virtual drives and associate them with given hosts (or simply localhost) and ports where NBD servers are listening. From the user's perspective a virtual drive should be close to what a physical drive is, so it should be accessible as something like \\.\PhysicalDriveX (maybe \\.\VirtualDriveX?) and be visible in the Disk Management (diskmgmt.msc) and mountvol tools at least. (The only thing I found remotely close to NBD on Windows is ImDisk's proxy mode and its companion tool devio, but AFAIK ImDisk only works at the partition level (so no virtual drive) and devio uses a different protocol.) Secondary question: are there any (preferably simple) Windows-specific solutions that allow creating a virtual drive that delegates read/write requests to user space via some explicit mechanism (like TCP, IPC, a DLL implementing a given API, etc.)?

    Read the article

  • Command to determine whether ZooKeeper Server is Leader or Follower

    - by utrecht
    Introduction: A ZooKeeper quorum consisting of three ZooKeeper servers has been created. The zoo.cfg located on all three ZooKeeper servers looks as follows:

      maxClientCnxns=50
      # The number of milliseconds of each tick
      tickTime=2000
      # The number of ticks that the initial
      # synchronization phase can take
      initLimit=10
      # The number of ticks that can pass between
      # sending a request and getting an acknowledgement
      syncLimit=5
      # the directory where the snapshot is stored.
      dataDir=/var/lib/zookeeper
      # the port at which the clients will connect
      clientPort=2181
      server.1=ip1:2888:3888
      server.2=ip2:2888:3888
      server.3=ip3:2888:3888

    It is clear that one of the three ZooKeeper servers will become the leader and the others followers. If the leader ZooKeeper server has been shut down, leader election will start again. The aim is to check whether another ZooKeeper server becomes the leader when the current leader is shut down. Question: which command needs to be issued to check whether a ZooKeeper server is a leader or a follower?
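
    Two common ways to check, assuming the standard ZooKeeper distribution and that the four-letter-word admin commands are enabled on the client port:

      # Ask the server directly; the "Mode:" line reports leader or follower
      echo srvr | nc localhost 2181
      # Or use the bundled control script, which prints the same mode
      /path/to/zookeeper/bin/zkServer.sh status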

    Read the article

  • how install minimum domain email piping to script in centos?

    - by Adam Ramadhan
    Hello, I have searched Google for a simple tutorial on how to set up email piping. First, how does email technically work? "SMTP is a process that binds to port 25, waiting for email requests that come in from another SMTP process (on another server); the domain's MX record determines which server receives the message on port 25 whenever any email goes to the domain (MX.domain.tld)." That, in a nutshell, is how emailing works; am I right, or is there something wrong here? Second, if I'm right, we need to set up an SMTP server so we can receive incoming email routed via the MX record, right? I've looked through Google and found what seem to be the two best SMTP servers, Exim and Postfix. Can anybody give us a simple tutorial on installing and setting up email piping on a freshly installed Linux/CentOS box? Example: *.domain.tld -> allinonepipe.php. Thanks.
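
    A minimal sketch of the piping part with Postfix, assuming it is already installed and accepting mail for the domain, and that allinonepipe.php is the handler from the question (the path is a placeholder):

      # /etc/aliases: deliver a local address into the script
      allinone: "|/usr/bin/php /var/www/allinonepipe.php"

      # rebuild the alias database and reload Postfix
      newaliases
      postfix reload

    To catch every address at the domain (the *.domain.tld part), a virtual_alias_maps entry such as "@domain.tld allinone" can point the whole domain at that alias.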

    Read the article

  • core.* files eating up server space (~50MB)

    - by skytreader
    I'm renting server space from someone and, upon logging in to my control panel after quite some time, noticed an abnormal spike (~50MB) in disk usage. Upon investigating, I found a lot of core.* files scattered around my public_html directory. Each one is more than 5MB in size but no more than 6MB. The * part is all numbers (as a regex, that would be core\.\d+). I downloaded one and checked the contents. There was a lot of balderdash (NUL mostly, but also a scattering of ETB, ETX, STX characters), but there's this block of readable text which says: "This text is part of the internal format of your mail folder, and is not a real message. It is created automatically by the mail system software. If deleted, important folder data will be lost, and it will be re-created with the data reset to initial values." Pretty self-explanatory. A few blocks above that text are some more readable messages that look like logs but are sandwiched between non-printable characters. I've extracted some below:

      Scan not valid for mh mailboxes
      Bogus character 0x%x in news state
      Can't rewrite news state %.80s
      Error closing backup news state %.80s
      No state for newsgroup %.80s found

    Now, a few concerns: am I under attack? The messages seem to be about my webmail, but I don't use my personal webmail that much - only for a vanity email address and an inbox for an outdated comments system. However, lately I seem to notice a spike in the spam for my vanity mail. (Note: the comments system is covered by a captcha, but every now and then some get through. My vanity email has a spam filter, but it isn't as good as I'd like.) Next, if this is a feature, can I turn it off? Is it advisable to? I've only got 150MB, so you see why I'm fretting over a 50MB spike. Some final details: my only server-side scripts are in PHP. The directory that accumulated the largest number of these core files is the one containing the Wordpress-managed subdomain of my site. I manage my server through cPanel. Lastly, I decided to delete these files, and after some checking nothing seems amiss on my websites or in my mail. They are indeed the ones responsible for the ~50MB spike, as my disk space usage is back to what I expected.
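
    Those files look like core dumps (crash snapshots of a process) rather than mail data in themselves; the readable mailbox text is just whatever was in the crashing program's memory. A quick way to confirm which program is producing them, assuming shell access to the account:

      # Typically reports something like: "core.12345: ELF ... core file ... from 'imapd'"
      file core.12345
      # Core dumps can be disabled for processes started from this shell
      ulimit -c 0

    The core.12345 name is just an example matching the core.\d+ pattern from the question.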

    Read the article

  • How did Google get on my Mac?

    - by SamGoody
    I am running a MacBook Pro and have never installed Chrome, Google Earth, or anything blatantly Google. I just installed Little Snitch (are there no good free firewalls for Mac?) and see that curl is sending to Google every few minutes, as is a request to Google Update and more. Little Snitch doesn't say what program set up these requests. So, how do I find out how Google got on my machine, why it is sending so many requests (every minute or so), and how do I remove it (and is it there for reasons other than to help Google spy on me)?
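
    A common source of this traffic on Macs is Google's background updater (Keystone), which gets installed alongside Google applications and some bundled software; a quick way to look for it, with no guarantee that's what Little Snitch is actually seeing:

      # Background jobs registered for the current user and system-wide
      launchctl list | grep -i google
      ls -l ~/Library/LaunchAgents /Library/LaunchAgents /Library/LaunchDaemons | grep -i google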

    Read the article

  • How to host ASP.NET application externally?

    - by Josh
    I have an ASP.NET application that I can get to locally by going to 192.168.1.102:81/TestApp. I would like to host the application externally by going to domain.com:81/TestApp (I already have my domain pointing to my router and this works fine - I have apache running on port 80 on another server). I modified the router settings to point any request coming through port 81 to 192.168.1.102. I am still having trouble accessing the ASP.NET site (I get the error message that "This link appears to be broken"). Am I missing something? How can I redirect domain.com:81/TestApp to my ASP.NET application? Thanks.
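
    Two things worth checking when a port-forwarded IIS site works locally but not externally: the Windows firewall on the IIS box must allow inbound port 81, and the site's binding should not be pinned to the LAN address only. A sketch using Windows' built-in tools (the rule name is arbitrary):

      rem Allow inbound TCP 81 through Windows Firewall
      netsh advfirewall firewall add rule name="IIS port 81" dir=in action=allow protocol=TCP localport=81
      rem Confirm the site is bound to *:81 rather than a specific internal IP
      %windir%\system32\inetsrv\appcmd list sites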

    Read the article

  • Retrieving database column using JSON [migrated]

    - by arokia
    I have a database table consisting of 4 columns (id, symbol, name, contractnumber). All 4 columns and their data are displayed in the user interface using JSON. There is a function which is responsible for adding a new column to the database, e.g. countrycode. The column is added successfully to the database, BUT I am not able to show the newly added column in the user interface. Below is the code that displays the columns. Can you help me?

    table.php:

      $(document).ready(function () {
          // prepare the data
          var theme = getDemoTheme();
          var source = {
              datatype: "json",
              datafields: [
                  { name: 'id' },
                  { name: 'symbol' },
                  { name: 'name' },
                  { name: 'contractnumber' }
              ],
              url: 'data.php',
              filter: function() {
                  // update the grid and send a request to the server.
                  $("#jqxgrid").jqxGrid('updatebounddata', 'filter');
              },
              cache: false
          };
          var dataAdapter = new $.jqx.dataAdapter(source);
          // initialize jqxGrid
          $("#jqxgrid").jqxGrid({
              source: dataAdapter,
              width: 670,
              theme: theme,
              showfilterrow: true,
              filterable: true,
              columns: [
                  { text: 'id', datafield: 'id', width: 200 },
                  { text: 'symbol', datafield: 'symbol', width: 200 },
                  { text: 'name', datafield: 'name', width: 100 },
                  { text: 'contractnumber', filtertype: 'list', datafield: 'contractnumber' }
              ]
          });
      });

    data.php:

      <?php
      # Include the db.php file
      include('db.php');
      $query = "SELECT * FROM pricelist";
      $result = mysql_query($query) or die("SQL Error 1: " . mysql_error());
      $orders = array();
      // get data and store in a json array
      while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
          $pricelist[] = array(
              'id' => $row['id'],
              'symbol' => $row['symbol'],
              'name' => $row['name'],
              'contractnumber' => $row['contractnumber']
          );
      }
      echo json_encode($pricelist);
      ?>
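
    The new column only shows up if it is added everywhere the columns are listed by name: the SQL result already contains it (SELECT *), but data.php must emit it and table.php must declare it. A sketch of the data.php side; the datafields and columns arrays in table.php need a matching 'countrycode' entry as well:

      // inside the while loop in data.php
      $pricelist[] = array(
          'id'             => $row['id'],
          'symbol'         => $row['symbol'],
          'name'           => $row['name'],
          'contractnumber' => $row['contractnumber'],
          'countrycode'    => $row['countrycode']   // newly added column must be emitted too
      );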

    Read the article

  • Problems restoring old backups in NetBackup 6.5

    - by gharper
    I had a server that was decommissioned and replaced last year, and since the server was no longer in use, I deleted its client and backup policy from the NetBackup Admin Console shortly afterwards. I recently got a request to restore a file from the old server; however, when I specify the source client for the restore, I get an error message saying: "WARNING: server (backupserver) does not contain any backups for client (oldserver) using the specified policy type (Standard) as requested by client (backupserver). [Ok]" In addition to that error, I can't seem to run a Client Backup report on the old client any more to determine what tapes I need to recall in order to re-index and restore the files. My questions: does deleting the client somehow remove NetBackup's ability to ever restore files from the old system, even if the backups have a retention period of infinity? Is there a way to restore the file from the tape, assuming I can figure out which tape I need?
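
    If the backup images themselves were never expired, the catalog entries may still exist even though the client and policy are gone; a sketch of how to check from the master server, assuming the usual UNIX install path for the NetBackup admin commands:

      # List any catalogued backup images for the old client in a given date range
      /usr/openv/netbackup/bin/admincmd/bpimagelist -client oldserver -d 01/01/2011 -e 12/31/2011

    If the images have already expired out of the catalog, the tapes would have to be re-imported (bpimport) before a restore is possible.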

    Read the article

  • R12 Diagnostic Script for Purchasing Encumbrance Issues

    - by Oracle_EBS
    Do you have a Release 12 Purchasing document with an accounting encumbrance error? Get all the relevant data in one step using the new diagnostic in Doc ID 1483743.1, 'R12: Diagnostic Script to help troubleshoot Purchasing Encumbrance Issues', and avoid the back-and-forth with support for data collection. Query the document ID in My Oracle Support and add it to your Favorites using the star icon for quick access. The note includes when to use the script and how to use it. The script will produce a user-friendly HTML output that contains information relevant to encumbrance issues, along with some data validation checks to identify common data corruption issues in your document. For example, this one diagnostic will provide information on the following: Cross Product Setup, Document Data Dump, Funds Availability, Subledger Accounting information, GL and AP Invoice Data, and Debug and Trace. This output is ideal for self service, as it provides known issues in the Data Validation section (related to the document) with links to key documentation, or the report can be uploaded to support when logging a Service Request. To see more about the diagnostic, attend our September 11, 2012 webcast 'Overview of Procurement Patching and New Tools for Issue Resolution'. Visit Doc ID 1479718.1 to sign up. Note: this topic will not be listed, as it has only just been added.

    Read the article

  • Unity , libgdx, or something else to develop my first game for Android?

    - by capcom
    I want to start by saying that I absolutely love Unity (even more when I team it up with Blender). I really want to start developing games for Android, but it seems like Unity poses way too many roadblocks in terms of which devices it supports (and even if it does support them, it doesn't work well on all of them). I've been looking around for alternatives, and found something called libgdx. Well, it's nothing like Unity unfortunately, but at least it seems like I may be able to reach a larger audience in the market. I'd like to start by making 2D games, but with 3D graphics (say, imported from Blender). I can do this very easily in Unity, and it seems like it should be alright with libgdx too. But I really want to know if ditching Unity is a smart idea, considering how comfortable I am with it already, and how much I like it. Finally, is libgdx something you would recommend considering my requirements/situation? BTW, I am quite familiar with Eclipse too. Many thanks. Feel free to request further details.

    Read the article

  • Ubuntu One - Files API - Cloud - More detailed info somehwere?

    - by Brian McCavour
    I am just starting on a mobile app for Ubuntu One, and I'm reviewing the info at https://one.ubuntu.com/developer/files/store_files/cloud. I find the information a bit lacking, though. It's a nice reference, but as someone not familiar with it, I had to do a Google search to find out what a "volume" is exactly (it's kind of obvious, but it never hurts to know the specifics). There are also things like: "GET /api/file_storage/v1/volumes - Return a JSON list of Volume Representations, one for each volume. A volume is a synced folder, or the Ubuntu One folder, owned by the user. Note that all volume paths begin with ~." But there's no such thing as a JSON "list"; does it mean array? And other things... So I was wondering if there exists another page with more detailed information, maybe some sample requests/responses or something? I could just write a little proof-of-concept app to answer some of these questions, but I prefer not to unless I have to. Thanks.

    Read the article

  • Exchange 2007 mailbox reassignment

    - by John Virgolino
    I am trying to move a mailbox from one user account to another within a single AD domain. We are using Exchange 2007. I have followed these steps: disable the account; from the "Disconnected Mailbox" container, connect the mailbox to the new account (I get a success message). When I try to log in to OWA using the new user account, I get this message: "Outlook Web Access could not connect to Microsoft Exchange. If the problem continues, contact technical support for your organization."

      Request Url: https://mail.somedomain.com:443/owa/default.aspx
      User host address: 1.2.3.4
      Exception type: Microsoft.Exchange.Data.Storage.ConnectionFailedTransientException
      Exception message: Cannot open mailbox /o=cgsexchangeorganization/ou=exchange administrative group (fydibohf23spdlt)/cn=recipients/cn=someuser.

    NOTE: I changed some identifying information for security purposes. I have tried multiple times and end up in the same place. When I log in to OWA with the old account, I get an error that the mailbox cannot be found, which makes total sense. Does anybody have any ideas on this? Thanks!
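
    A sketch of the same reconnect from the Exchange Management Shell, with placeholder database and account names; Clean-MailboxDatabase simply refreshes the disconnected-mailbox state before reconnecting:

      Disable-Mailbox -Identity "olduser"
      Clean-MailboxDatabase "MailboxDatabase01"
      Connect-Mailbox -Database "MailboxDatabase01" -Identity "Old User" -User "newuser"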

    Read the article

  • SSH reverse tunnel to monitor and manage remote devices

    - by acid_crucifix
    I have a distributed set of devices running Ubuntu 12.04 that I am distributing to clients, and I would like to manage them remotely. They may not have fixed IPs and will potentially be behind firewalls. What I am planning to do is have the devices (which are permanently connected to the net) poll a request URL and, based on the response, open a reverse tunnel to my server so that I can access them via that tunnel. Most of what I have read about reverse tunnels over SSH covers one-off use, and very little covers heavy production usage. Is there some reason for this - security issues, or stability? Any help would be much obliged.
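
    A minimal sketch of the tunnel itself, assuming a dedicated low-privilege account on the management server and autossh (or similar supervision) to keep the connection alive; host names and ports are placeholders:

      # On the device: expose its local sshd as port 20022 on the management server
      autossh -M 0 -N -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
          -R 20022:localhost:22 tunneluser@manage.example.com
      # On the management server: reach the device through the forwarded port
      ssh -p 20022 deviceuser@localhost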

    Read the article

  • Double try_files to solve the nginx's "No input file specified" issue

    - by Howard
    I am following the nginx wiki (http://wiki.nginx.org/WordPress) to set up my WordPress:

      location / {
          try_files $uri $uri/ /index.php?$args;
      }

    With the above lines, when a static file is not found the request is handed to WordPress's index.php, which is fine, but... Problem: when I request a non-existent PHP script, e.g. http://www.example.com/foo.php, nginx gives me "No input file specified". I want nginx to return 404 instead of that message, so in the main fcgi config I added a second try_files:

      location ~ \.php$ {
          try_files $uri =404;
          fastcgi_split_path_info ^(.+\.php)(/.+)$;
          include /etc/nginx/fastcgi_params;
          ...
      }

    This worked, but I am wondering whether there is a better way to handle it?

    Read the article
